Mar 20 08:59:38 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 08:59:38 crc restorecon[4704]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:38 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 
08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 08:59:39 crc 
restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]:
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 08:59:39 crc restorecon[4704]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI.
Mar 20 08:59:40 crc kubenswrapper[4958]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.134825 4958 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139756 4958 feature_gate.go:330] unrecognized feature gate: NewOLM
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139787 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139797 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139806 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139814 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139823 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139830 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139838 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139846 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139854 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139861 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139869 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139876 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139884 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139891 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139900 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139907 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139915 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139922 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139938 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139946 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139953 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139963 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139971 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139978 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139986 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.139993 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140001 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140008 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140016 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140026 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release.
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140037 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140045 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140054 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140062 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140072 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140080 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140088 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140096 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140104 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140112 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140121 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140129 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140138 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140147 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140157 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true.
It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140167 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140175 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140185 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140192 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140200 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140208 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140216 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140224 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140231 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140241 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140248 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140256 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140264 4958 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140271 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140278 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140286 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140297 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140306 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140318 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140326 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140333 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140341 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140349 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140360 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
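[Editor's note] The long runs of "unrecognized feature gate" warnings repeat because the wrapper re-parses its gate list several times during startup, so the same names (NewOLM, GatewayAPI, VSphereMultiNetworks, ...) recur in each pass. These are OpenShift-level gates that the embedded Kubernetes feature_gate parser does not recognize; the warnings are noisy but harmless. A quick way to reduce the noise to a unique, counted set is the journal filter sketched below (the journalctl invocation and unit name are assumptions for this host).

    import re
    import subprocess
    from collections import Counter

    # Sketch: collapse the repeated "unrecognized feature gate" warnings
    # into a unique set with repeat counts. Unit name is an assumption.
    out = subprocess.run(
        ["journalctl", "-u", "kubelet", "--no-pager"],
        capture_output=True, text=True, check=True,
    ).stdout

    gates = Counter(re.findall(r"unrecognized feature gate: (\S+)", out))
    for name, seen in gates.most_common():
        print(f"{seen:3d}x {name}")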
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.140368 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140549 4958 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140574 4958 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140593 4958 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140661 4958 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140679 4958 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140690 4958 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140702 4958 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140714 4958 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140724 4958 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140733 4958 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140743 4958 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140752 4958 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140762 4958 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140771 4958 flags.go:64] FLAG: --cgroup-root="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140779 4958 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140788 4958 flags.go:64] FLAG: --client-ca-file="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140798 4958 flags.go:64] FLAG: --cloud-config="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140807 4958 flags.go:64] FLAG: --cloud-provider="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140816 4958 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140831 4958 flags.go:64] FLAG: --cluster-domain="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140839 4958 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140849 4958 flags.go:64] FLAG: --config-dir="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140858 4958 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140867 4958 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140879 4958 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140888 4958 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140897 4958 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140906 4958 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 08:59:40 crc 
kubenswrapper[4958]: I0320 08:59:40.140915 4958 flags.go:64] FLAG: --contention-profiling="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140924 4958 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140933 4958 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140942 4958 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140951 4958 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140962 4958 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140973 4958 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140982 4958 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.140991 4958 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141000 4958 flags.go:64] FLAG: --enable-server="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141009 4958 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141020 4958 flags.go:64] FLAG: --event-burst="100" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141029 4958 flags.go:64] FLAG: --event-qps="50" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141040 4958 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141049 4958 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141058 4958 flags.go:64] FLAG: --eviction-hard="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141070 4958 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141078 4958 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141087 4958 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141096 4958 flags.go:64] FLAG: --eviction-soft="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141107 4958 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141116 4958 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141125 4958 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141135 4958 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141144 4958 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141153 4958 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141163 4958 flags.go:64] FLAG: --feature-gates="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141173 4958 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141182 4958 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141191 4958 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 08:59:40 crc 
kubenswrapper[4958]: I0320 08:59:40.141200 4958 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141210 4958 flags.go:64] FLAG: --healthz-port="10248" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141219 4958 flags.go:64] FLAG: --help="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141227 4958 flags.go:64] FLAG: --hostname-override="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141236 4958 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141244 4958 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141254 4958 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141262 4958 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141271 4958 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141279 4958 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141288 4958 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141297 4958 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141307 4958 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141316 4958 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141325 4958 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141334 4958 flags.go:64] FLAG: --kube-reserved="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141343 4958 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141351 4958 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141360 4958 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141369 4958 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141378 4958 flags.go:64] FLAG: --lock-file="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141386 4958 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141404 4958 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141413 4958 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141426 4958 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141435 4958 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141443 4958 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141452 4958 flags.go:64] FLAG: --logging-format="text" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141461 4958 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141470 4958 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 08:59:40 crc 
kubenswrapper[4958]: I0320 08:59:40.141479 4958 flags.go:64] FLAG: --manifest-url="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141488 4958 flags.go:64] FLAG: --manifest-url-header="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141500 4958 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141510 4958 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141521 4958 flags.go:64] FLAG: --max-pods="110" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141530 4958 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141538 4958 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141547 4958 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141557 4958 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141566 4958 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141575 4958 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141583 4958 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141642 4958 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141651 4958 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141660 4958 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141670 4958 flags.go:64] FLAG: --pod-cidr="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141678 4958 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141691 4958 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141704 4958 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141713 4958 flags.go:64] FLAG: --pods-per-core="0" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141722 4958 flags.go:64] FLAG: --port="10250" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141731 4958 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141739 4958 flags.go:64] FLAG: --provider-id="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141749 4958 flags.go:64] FLAG: --qos-reserved="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141761 4958 flags.go:64] FLAG: --read-only-port="10255" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141771 4958 flags.go:64] FLAG: --register-node="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141779 4958 flags.go:64] FLAG: --register-schedulable="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141788 4958 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141804 4958 flags.go:64] FLAG: --registry-burst="10" Mar 20 08:59:40 crc 
kubenswrapper[4958]: I0320 08:59:40.141813 4958 flags.go:64] FLAG: --registry-qps="5" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141821 4958 flags.go:64] FLAG: --reserved-cpus="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141830 4958 flags.go:64] FLAG: --reserved-memory="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141842 4958 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141851 4958 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141860 4958 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141869 4958 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141879 4958 flags.go:64] FLAG: --runonce="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141887 4958 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141897 4958 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141906 4958 flags.go:64] FLAG: --seccomp-default="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141915 4958 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141923 4958 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141932 4958 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141941 4958 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141951 4958 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141960 4958 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141969 4958 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141977 4958 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141986 4958 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.141995 4958 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142004 4958 flags.go:64] FLAG: --system-cgroups="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142013 4958 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142029 4958 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142037 4958 flags.go:64] FLAG: --tls-cert-file="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142048 4958 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142068 4958 flags.go:64] FLAG: --tls-min-version="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142080 4958 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142089 4958 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142098 4958 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 08:59:40 crc kubenswrapper[4958]: 
I0320 08:59:40.142107 4958 flags.go:64] FLAG: --topology-manager-scope="container" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142116 4958 flags.go:64] FLAG: --v="2" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142127 4958 flags.go:64] FLAG: --version="false" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142138 4958 flags.go:64] FLAG: --vmodule="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142149 4958 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.142159 4958 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142371 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142381 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142390 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142400 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142411 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142422 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142432 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142440 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142449 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142458 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142467 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142475 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142483 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142491 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142499 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142507 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142515 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142524 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142532 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142539 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:59:40 
crc kubenswrapper[4958]: W0320 08:59:40.142547 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142555 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142562 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142573 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142581 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142589 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142642 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142659 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142670 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142684 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142696 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142706 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142715 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142723 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142730 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142738 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142746 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142753 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142761 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142769 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142777 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142785 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142792 4958 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142800 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142808 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142815 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 
20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142823 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142831 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142838 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142846 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142854 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142861 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142869 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142876 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142884 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142896 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142903 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142911 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142918 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142926 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142934 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142944 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142952 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142960 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142967 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142977 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
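[Editor's note] The flags.go:64 block above dumps every effective command-line flag in the form FLAG: --name="value". When comparing two boots, it helps to load that dump into a mapping; a minimal sketch that reads the journal text on stdin (e.g. piped from journalctl):

    import re
    import sys

    # Sketch: turn the 'flags.go:64] FLAG: --name="value"' dump above into
    # a dict, e.g. flags["node-ip"] == "192.168.126.11". The non-greedy
    # value match is needed because several entries share one wrapped line.
    FLAG_RE = re.compile(r'FLAG: --([\w-]+)="(.*?)"')

    flags = dict(FLAG_RE.findall(sys.stdin.read()))
    print(f"{len(flags)} flags parsed")
    print("config file:", flags.get("config"))
    print("node-ip:    ", flags.get("node-ip"))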
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142987 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.142997 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.143005 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.143014 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.143022 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.144101 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.156957 4958 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.157057 4958 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157227 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157256 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157266 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157275 4958 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157285 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157294 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157304 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157313 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157321 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157329 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157337 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157346 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157355 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157364 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157373 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack 
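[Editor's note] The feature_gate.go:386 line above prints the resolved gate set as a Go map literal, {map[CloudDualStackNodeIPs:true ... VolumeAttributesClass:false]}. A sketch converting that literal into a Python dict of booleans (the hard-coded line is an abbreviated copy of the log entry):

    import re

    # Sketch: parse the Go map literal from feature_gate.go:386 above
    # into {"CloudDualStackNodeIPs": True, ...}.
    line = ('feature gates: {map[CloudDualStackNodeIPs:true '
            'DisableKubeletCloudCredentialProviders:true KMSv1:true '
            'ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}')

    body = re.search(r"\{map\[(.*)\]\}", line).group(1)
    gates = {k: v == "true" for k, v in
             (pair.split(":") for pair in body.split())}
    print(gates)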
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157381 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157391 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157399 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157409 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157421 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157432 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157441 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157449 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157457 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157467 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157476 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157484 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157495 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157506 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157515 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157523 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157532 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157540 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157549 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157562 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157574 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157584 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157623 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157633 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157642 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157650 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157663 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157675 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157686 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157696 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157705 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157716 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157726 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157735 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157742 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157750 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157758 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157766 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157774 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157781 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157789 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157797 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157805 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157813 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157823 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157831 4958 feature_gate.go:330] 
unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157839 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157846 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157854 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157862 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157869 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157877 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157884 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157892 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157900 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.157909 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.157924 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158152 4958 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158167 4958 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158175 4958 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158185 4958 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158193 4958 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158202 4958 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158213 4958 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158225 4958 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158235 4958 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158244 4958 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158253 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158262 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158270 4958 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158278 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158287 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158294 4958 feature_gate.go:330] unrecognized feature gate: Example Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158303 4958 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158311 4958 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158319 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158327 4958 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158335 4958 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158343 4958 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158351 4958 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158359 4958 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158369 4958 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158378 4958 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158388 4958 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158398 4958 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158408 4958 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158417 4958 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158425 4958 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158434 4958 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158442 4958 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158450 4958 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158459 4958 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158466 4958 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158474 4958 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158484 4958 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158493 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158502 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158510 4958 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158518 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158528 4958 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158536 4958 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158544 4958 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158552 4958 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158560 4958 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158568 4958 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158576 4958 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158584 4958 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158620 4958 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158629 4958 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158637 4958 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158645 4958 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158653 4958 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158661 4958 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158669 4958 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158677 4958 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158685 4958 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158692 4958 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158700 4958 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158708 4958 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158716 4958 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158723 4958 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158731 4958 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158739 4958 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158746 4958 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158754 4958 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158762 4958 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158770 4958 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.158779 4958 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.158792 4958 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.159097 4958 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.163862 4958 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.170098 4958 bootstrap.go:101] "Use the 
bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.170282 4958 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.173112 4958 server.go:997] "Starting client certificate rotation" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.173167 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.174262 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.205161 4958 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.209968 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.214587 4958 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.234975 4958 log.go:25] "Validated CRI v1 runtime API" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.275866 4958 log.go:25] "Validated CRI v1 image API" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.278674 4958 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.285572 4958 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-08-54-12-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.285675 4958 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.314475 4958 manager.go:217] Machine: {Timestamp:2026-03-20 08:59:40.311198552 +0000 UTC m=+0.633214540 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654132736 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:4d937261-ad72-4cd3-9e28-1484a891ee0d BootID:f885a277-9b85-4e30-8d86-f10d1510a78a Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 
Capacity:6730829824 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827068416 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:a0:e1:60 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:a0:e1:60 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:62:f2:9c Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:27:83:6b Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:58:db:ad Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:90:a6:45 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0e:1c:db:f1:56:47 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:a6:0e:31:a2:fe:dd Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654132736 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] 
Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.314792 4958 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.314975 4958 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.316573 4958 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.316895 4958 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.316956 4958 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.317429 4958 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.317450 4958 
container_manager_linux.go:303] "Creating device plugin manager" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.318049 4958 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.318106 4958 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.318352 4958 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.318476 4958 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.321949 4958 kubelet.go:418] "Attempting to sync node with API server" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.322037 4958 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.322103 4958 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.322127 4958 kubelet.go:324] "Adding apiserver pod source" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.322168 4958 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.331394 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.331621 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.331576 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.331788 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.333984 4958 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.335228 4958 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.337003 4958 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340275 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340313 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340324 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340335 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340353 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340363 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340373 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340389 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340403 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340415 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340430 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.340441 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.342072 4958 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.342751 4958 server.go:1280] "Started kubelet" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.343643 4958 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.343639 4958 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.344421 4958 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 08:59:40 crc systemd[1]: Started Kubernetes Kubelet. 
Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.344833 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.345260 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.345304 4958 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.345677 4958 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.345696 4958 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.345663 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.345794 4958 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.346471 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="200ms" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.346875 4958 factory.go:55] Registering systemd factory Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.346906 4958 factory.go:221] Registration of the systemd container factory successfully Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.347014 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.347226 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.347319 4958 factory.go:153] Registering CRI-O factory Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.347347 4958 factory.go:221] Registration of the crio container factory successfully Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.347428 4958 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.347461 4958 factory.go:103] Registering Raw factory Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.347489 4958 manager.go:1196] Started watching for new ooms in manager Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.348061 4958 manager.go:319] Starting recovery of all containers Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.348175 4958 server.go:460] "Adding debug handlers to kubelet server" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 
08:59:40.371212 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.371716 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.371802 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.371879 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.371962 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372033 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372112 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372204 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372286 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372365 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372442 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 
08:59:40.372522 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.369561 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372626 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372836 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372915 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372935 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372953 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372972 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.372989 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373005 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373022 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373037 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373052 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373072 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373085 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373100 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373120 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373136 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373151 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373165 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373183 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373199 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373214 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.373230 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.374939 4958 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.374984 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375004 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375022 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375040 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375057 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375079 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375096 4958 reconstruct.go:130] "Volume is marked as 
uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375115 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375133 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375153 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375169 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375217 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375294 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375312 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375336 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375352 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375369 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375385 4958 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375411 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375458 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375480 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375499 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375521 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375551 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375571 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375589 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375624 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375646 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375664 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375711 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375729 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375747 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375765 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375781 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375798 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375816 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375836 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375855 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375874 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375891 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375911 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375930 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375946 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375965 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375981 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.375998 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376019 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376038 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376056 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376073 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376090 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376110 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376127 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376142 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376157 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376173 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376188 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376206 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376222 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376240 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376261 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376277 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376293 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376307 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376324 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376340 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376353 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376369 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376383 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376398 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376437 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376456 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376473 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376491 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376508 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376529 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376548 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376566 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376586 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376624 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376641 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376663 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376680 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376695 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376710 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376724 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376745 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376761 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376778 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376794 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376808 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376828 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376844 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376861 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376885 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376903 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376919 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376938 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376957 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376976 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.376994 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377013 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377029 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377045 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377061 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377079 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377095 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377114 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377134 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377151 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377171 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377189 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377206 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377223 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377243 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377259 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377277 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377297 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377318 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377333 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377349 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377363 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377379 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377398 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377413 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377430 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377447 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377462 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377481 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377499 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377521 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377536 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377553 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377568 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377585 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377624 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377640 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377659 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377675 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" 
volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377691 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377706 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377728 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377744 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377760 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377775 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377790 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377807 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377824 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377842 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377859 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377878 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377894 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377909 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377926 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377940 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.377986 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378005 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378023 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378040 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378056 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378114 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" 
volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378132 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378152 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378175 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378194 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378209 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378224 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378267 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378283 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378299 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378316 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378335 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378350 4958 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378366 4958 reconstruct.go:97] "Volume reconstruction finished" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.378377 4958 reconciler.go:26] "Reconciler: start to sync state" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.382339 4958 manager.go:324] Recovery completed Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.395903 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.408030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.408124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.408136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.409401 4958 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.409421 4958 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.409471 4958 state_mem.go:36] "Initialized new in-memory state store" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.427222 4958 policy_none.go:49] "None policy: Start" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.428537 4958 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.428693 4958 state_mem.go:35] "Initializing new in-memory state store" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.431190 4958 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.433403 4958 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.433454 4958 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.433481 4958 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.433634 4958 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.435564 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.435683 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.447052 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.485506 4958 manager.go:334] "Starting Device Plugin manager" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.485755 4958 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.485776 4958 server.go:79] "Starting device plugin registration server" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.486203 4958 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.486222 4958 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.486353 4958 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.486471 4958 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.486484 4958 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.492892 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.534439 4958 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.534586 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.535776 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.535831 4958 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.535844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.535999 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.536254 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.536338 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537129 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537346 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537415 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537497 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537823 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537876 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.537890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.538136 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.538370 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.538467 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539161 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539369 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539433 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539458 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539934 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.539969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540154 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540191 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540287 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540366 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.540393 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.541042 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.541064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.541076 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.547112 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="400ms" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581494 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581548 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581578 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581621 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581701 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581789 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581850 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.581965 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582025 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582074 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582097 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582120 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.582140 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.586854 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.588172 4958 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.588214 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.588229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.588262 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.588852 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.65:6443: connect: connection refused" node="crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683113 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683149 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683174 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683205 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683230 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683247 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683266 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683294 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683315 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683335 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683320 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683370 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683493 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683491 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683412 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683334 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc 
kubenswrapper[4958]: I0320 08:59:40.683456 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683475 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683656 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683510 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683351 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683815 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683384 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683853 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683881 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683481 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683887 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.683919 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.684001 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.789337 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.790826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.790882 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.790894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.790927 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.791448 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.65:6443: connect: connection refused" node="crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.857909 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.875806 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.883337 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.904806 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.910905 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-2e7823078fc15665c6c8bb70a17a7d11e05547a73fa61faee0ba0ea29a8df390 WatchSource:0}: Error finding container 2e7823078fc15665c6c8bb70a17a7d11e05547a73fa61faee0ba0ea29a8df390: Status 404 returned error can't find the container with id 2e7823078fc15665c6c8bb70a17a7d11e05547a73fa61faee0ba0ea29a8df390 Mar 20 08:59:40 crc kubenswrapper[4958]: I0320 08:59:40.911060 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.913844 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-2cd30b92950f019aca661d8c89867b2a1ea2702e6e85589c8d216bcf6fa46a96 WatchSource:0}: Error finding container 2cd30b92950f019aca661d8c89867b2a1ea2702e6e85589c8d216bcf6fa46a96: Status 404 returned error can't find the container with id 2cd30b92950f019aca661d8c89867b2a1ea2702e6e85589c8d216bcf6fa46a96 Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.919485 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-963725f7940d21954a12fe2d899ebc258809c37cc0672d1be4318dfd9f2ca2bf WatchSource:0}: Error finding container 963725f7940d21954a12fe2d899ebc258809c37cc0672d1be4318dfd9f2ca2bf: Status 404 returned error can't find the container with id 963725f7940d21954a12fe2d899ebc258809c37cc0672d1be4318dfd9f2ca2bf Mar 20 08:59:40 crc kubenswrapper[4958]: W0320 08:59:40.942516 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-988db182429715331600c7157a5b12f82e484fe112f57f0b829a7ac9590a859c WatchSource:0}: Error finding container 988db182429715331600c7157a5b12f82e484fe112f57f0b829a7ac9590a859c: Status 404 returned error can't find the container with id 988db182429715331600c7157a5b12f82e484fe112f57f0b829a7ac9590a859c Mar 20 08:59:40 crc kubenswrapper[4958]: E0320 08:59:40.948005 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="800ms" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.191769 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.193551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.193638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.193655 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.193697 4958 kubelet_node_status.go:76] "Attempting to 
register node" node="crc" Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.194201 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.65:6443: connect: connection refused" node="crc" Mar 20 08:59:41 crc kubenswrapper[4958]: W0320 08:59:41.332948 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.333072 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.345867 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.438736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"963725f7940d21954a12fe2d899ebc258809c37cc0672d1be4318dfd9f2ca2bf"} Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.440234 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"2cd30b92950f019aca661d8c89867b2a1ea2702e6e85589c8d216bcf6fa46a96"} Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.441562 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"2e7823078fc15665c6c8bb70a17a7d11e05547a73fa61faee0ba0ea29a8df390"} Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.442967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"988db182429715331600c7157a5b12f82e484fe112f57f0b829a7ac9590a859c"} Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.445261 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a1346d6a65e8eacd76ac673a8ae6dd6aa163fad4cdaad9a3f2f4407851b2cd71"} Mar 20 08:59:41 crc kubenswrapper[4958]: W0320 08:59:41.748710 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.749102 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.748905 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="1.6s" Mar 20 08:59:41 crc kubenswrapper[4958]: W0320 08:59:41.845362 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.845468 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:41 crc kubenswrapper[4958]: W0320 08:59:41.940321 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.940414 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.995104 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.996952 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.997034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.997053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:41 crc kubenswrapper[4958]: I0320 08:59:41.997097 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:59:41 crc kubenswrapper[4958]: E0320 08:59:41.997890 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.65:6443: connect: connection refused" node="crc" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.324067 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:59:42 crc kubenswrapper[4958]: E0320 08:59:42.325211 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.346820 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.452485 4958 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="6d8c5f495884d476771ab784e3239ca8d5b93fe9612fdff95bcc089261031113" exitCode=0 Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.452559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"6d8c5f495884d476771ab784e3239ca8d5b93fe9612fdff95bcc089261031113"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.452712 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.454063 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.454133 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.454168 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.455411 4958 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73" exitCode=0 Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.455478 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.455501 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.456928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.456954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.456963 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.463663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d42563ceeeca2989e69c343c8e480952171423e86ec7da3f23acf67ea844b52b"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.463754 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.463755 
4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"c8433267d0628369259445adcb5c89c240d4a22a3f5de354dbf6c19b5e7f20fc"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.463966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.463995 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"485288d9b577950a20ea275f1289685b34ff9cf6debe3c6ddc1170b70ff8ef88"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.467485 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.467527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.467816 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.469262 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e" exitCode=0 Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.469385 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.469363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.470885 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.470918 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.471173 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.471839 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20" exitCode=0 Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.471928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20"} Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.472003 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.473443 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume 
controller attach/detach" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.473783 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.473860 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.473890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.474675 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.474729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:42 crc kubenswrapper[4958]: I0320 08:59:42.474749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:42 crc kubenswrapper[4958]: E0320 08:59:42.629063 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.65:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.346734 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:43 crc kubenswrapper[4958]: E0320 08:59:43.349883 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="3.2s" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.477017 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"cb9f621bed7e58d6cee8c63dac105f1d70ce17a24f893b09707baa202619495b"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.477150 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.478556 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.478586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.478622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.487627 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.487687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.487701 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.487932 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.489046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.489075 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.489087 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.491928 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.491983 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.492009 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.492031 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.494159 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb" exitCode=0 Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.494278 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.494329 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:43 crc 
kubenswrapper[4958]: I0320 08:59:43.494518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb"} Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495140 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495179 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495662 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.495678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.598744 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.600560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.600614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.600629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:43 crc kubenswrapper[4958]: I0320 08:59:43.600660 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:59:43 crc kubenswrapper[4958]: E0320 08:59:43.601282 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.65:6443: connect: connection refused" node="crc" Mar 20 08:59:43 crc kubenswrapper[4958]: W0320 08:59:43.602493 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:43 crc kubenswrapper[4958]: E0320 08:59:43.602581 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:43 crc kubenswrapper[4958]: W0320 08:59:43.619445 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:43 crc kubenswrapper[4958]: E0320 08:59:43.619574 4958 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:43 crc kubenswrapper[4958]: W0320 08:59:43.669856 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:43 crc kubenswrapper[4958]: E0320 08:59:43.669963 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:44 crc kubenswrapper[4958]: W0320 08:59:44.169075 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:44 crc kubenswrapper[4958]: E0320 08:59:44.169179 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.65:6443: connect: connection refused" logger="UnhandledError" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.346516 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.65:6443: connect: connection refused Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.498373 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.501517 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="70296a0e0518a5bb70e48124c8008e5815b40c5c4a7a3d589da1b7674ec6ba0a" exitCode=255 Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.501682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"70296a0e0518a5bb70e48124c8008e5815b40c5c4a7a3d589da1b7674ec6ba0a"} Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.501800 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.503213 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.503275 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.503299 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504218 4958 scope.go:117] "RemoveContainer" containerID="70296a0e0518a5bb70e48124c8008e5815b40c5c4a7a3d589da1b7674ec6ba0a" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504669 4958 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5" exitCode=0 Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5"} Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504853 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504908 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.504975 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.505886 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.505976 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.506014 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.506025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.506116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.506141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.506152 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.507159 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.507184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:44 crc kubenswrapper[4958]: I0320 08:59:44.507193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.511839 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.514487 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c"} Mar 20 
08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.514714 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.516313 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.516344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.516356 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519125 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce"} Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519169 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60"} Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519183 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd"} Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519194 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2"} Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519223 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.519980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.520009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.520019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.771061 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.771962 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.775342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.775406 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:45 crc kubenswrapper[4958]: I0320 08:59:45.775423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.352336 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.406853 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.527427 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b"} Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.527497 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.528043 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.528069 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.528366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.528408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.528423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.529014 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.529044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.529053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.797502 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.797742 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.798956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.798999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.799009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.801509 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.802442 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.802482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.802496 4958 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:46 crc kubenswrapper[4958]: I0320 08:59:46.802518 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.211270 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.530531 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.530619 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.530846 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532051 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532050 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532100 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:47 crc kubenswrapper[4958]: I0320 08:59:47.532131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.536689 4958 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.537840 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.538959 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.539015 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.539031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:48 crc kubenswrapper[4958]: I0320 08:59:48.641991 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.434525 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.434789 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.435965 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.436034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 
08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.436053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.537464 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.538677 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.538725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.538736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.543879 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.544133 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.545162 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.545198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.545213 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.797973 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:59:49 crc kubenswrapper[4958]: I0320 08:59:49.798073 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 08:59:50 crc kubenswrapper[4958]: I0320 08:59:50.080429 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:50 crc kubenswrapper[4958]: I0320 08:59:50.080696 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:50 crc kubenswrapper[4958]: I0320 08:59:50.082222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:50 crc kubenswrapper[4958]: I0320 08:59:50.082276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:50 crc kubenswrapper[4958]: I0320 08:59:50.082287 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:50 crc kubenswrapper[4958]: E0320 08:59:50.493011 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node 
info: node \"crc\" not found" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.246549 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.247352 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.249029 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.249078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.249086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.251973 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.545989 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.548121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.548161 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.548174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:52 crc kubenswrapper[4958]: I0320 08:59:52.550649 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 08:59:53 crc kubenswrapper[4958]: I0320 08:59:53.549049 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:53 crc kubenswrapper[4958]: I0320 08:59:53.550853 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:53 crc kubenswrapper[4958]: I0320 08:59:53.550951 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:53 crc kubenswrapper[4958]: I0320 08:59:53.550982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:55 crc kubenswrapper[4958]: I0320 08:59:55.347527 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.497885 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:59:55 crc 
kubenswrapper[4958]: E0320 08:59:55.498780 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 08:59:55 crc kubenswrapper[4958]: W0320 08:59:55.499257 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.499311 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:59:55 crc kubenswrapper[4958]: W0320 08:59:55.501066 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.501116 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:59:55 crc kubenswrapper[4958]: I0320 08:59:55.501811 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:59:55 crc kubenswrapper[4958]: I0320 08:59:55.501921 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 08:59:55 crc kubenswrapper[4958]: W0320 08:59:55.502834 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.502898 4958 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.505121 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.506686 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 08:59:55 crc kubenswrapper[4958]: I0320 08:59:55.507561 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 08:59:55 crc kubenswrapper[4958]: I0320 08:59:55.507658 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 08:59:55 crc kubenswrapper[4958]: W0320 08:59:55.507774 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z Mar 20 08:59:55 crc kubenswrapper[4958]: E0320 08:59:55.507844 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:55Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.350724 4958 csi_plugin.go:884] Failed to contact API server when waiting for 
CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:56Z is after 2026-02-23T05:33:13Z Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.360347 4958 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]log ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]etcd ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-api-request-count-filter ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-startkubeinformers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-openshift-apiserver-reachable ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-oauth-apiserver-reachable ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/priority-and-fairness-config-consumer ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/priority-and-fairness-filter ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-apiextensions-informers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-apiextensions-controllers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/crd-informer-synced ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-system-namespaces-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-cluster-authentication-info-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-kube-apiserver-identity-lease-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-kube-apiserver-identity-lease-garbage-collector ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-legacy-token-tracking-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-service-ip-repair-controllers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [-]poststarthook/rbac/bootstrap-roles failed: reason withheld Mar 20 08:59:56 crc kubenswrapper[4958]: [-]poststarthook/scheduling/bootstrap-system-priority-classes failed: reason withheld Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/priority-and-fairness-config-producer ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/bootstrap-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/aggregator-reload-proxy-client-cert ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/start-kube-aggregator-informers ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-status-local-available-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-status-remote-available-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-registration-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-wait-for-first-sync ok Mar 20 08:59:56 crc 
kubenswrapper[4958]: [+]poststarthook/apiservice-discovery-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/kube-apiserver-autoregistration ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]autoregister-completion ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-openapi-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: [+]poststarthook/apiservice-openapiv3-controller ok Mar 20 08:59:56 crc kubenswrapper[4958]: livez check failed Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.362462 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.566831 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.567550 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.576108 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" exitCode=255 Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.576168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c"} Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.576260 4958 scope.go:117] "RemoveContainer" containerID="70296a0e0518a5bb70e48124c8008e5815b40c5c4a7a3d589da1b7674ec6ba0a" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.576407 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.577775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.577805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.577814 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:56 crc kubenswrapper[4958]: I0320 08:59:56.578339 4958 scope.go:117] "RemoveContainer" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 08:59:56 crc kubenswrapper[4958]: E0320 08:59:56.578510 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 08:59:57 crc kubenswrapper[4958]: I0320 08:59:57.348682 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
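[editor's note] The probe body above is the aggregated /livez report: each [+] line is a post-start hook that passed, the two [-] lines (rbac/bootstrap-roles and scheduling/bootstrap-system-priority-classes) are the ones that failed, and "reason withheld" means the endpoint hides failure details from callers it has not authorized, which is also why the earlier unauthenticated probe got 403 for system:anonymous. A sketch of that same unauthenticated request, with verification disabled only because the serving certificate is expired (the URL comes from the log; expect 403 without a bearer token):

    import ssl
    import urllib.request
    from urllib.error import HTTPError

    ctx = ssl.create_default_context()
    ctx.check_hostname = False          # must be cleared before CERT_NONE
    ctx.verify_mode = ssl.CERT_NONE     # the serving cert is expired

    try:
        with urllib.request.urlopen("https://api-int.crc.testing:6443/livez",
                                    context=ctx, timeout=5) as resp:
            print(resp.status, resp.read().decode())
    except HTTPError as e:
        print(e.code, e.read().decode())  # 403 matches the probe output above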
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:57Z is after 2026-02-23T05:33:13Z Mar 20 08:59:57 crc kubenswrapper[4958]: I0320 08:59:57.580813 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 08:59:58 crc kubenswrapper[4958]: I0320 08:59:58.348497 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:58Z is after 2026-02-23T05:33:13Z Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.348823 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T08:59:59Z is after 2026-02-23T05:33:13Z Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.572477 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.573168 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.574918 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.574954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.574965 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.587284 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.589150 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.590482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.590542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.590554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.798135 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 08:59:59 crc kubenswrapper[4958]: I0320 08:59:59.798242 4958 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 09:00:00 crc kubenswrapper[4958]: I0320 09:00:00.348394 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:00Z is after 2026-02-23T05:33:13Z Mar 20 09:00:00 crc kubenswrapper[4958]: E0320 09:00:00.493185 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.350804 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:01Z is after 2026-02-23T05:33:13Z Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.357097 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.357276 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.358506 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.358546 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.358562 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.359283 4958 scope.go:117] "RemoveContainer" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 09:00:01 crc kubenswrapper[4958]: E0320 09:00:01.359452 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.361357 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.594713 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.595996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.596114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:01 crc 
kubenswrapper[4958]: I0320 09:00:01.596184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.596843 4958 scope.go:117] "RemoveContainer" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 09:00:01 crc kubenswrapper[4958]: E0320 09:00:01.597079 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:01 crc kubenswrapper[4958]: E0320 09:00:01.903938 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:01Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.907133 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.908460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.908530 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.908556 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:01 crc kubenswrapper[4958]: I0320 09:00:01.908798 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 09:00:01 crc kubenswrapper[4958]: E0320 09:00:01.913794 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:01Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 09:00:02 crc kubenswrapper[4958]: W0320 09:00:02.155737 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:02Z is after 2026-02-23T05:33:13Z Mar 20 09:00:02 crc kubenswrapper[4958]: E0320 09:00:02.155859 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:02Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.349758 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:02Z is after 2026-02-23T05:33:13Z Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.890911 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.891188 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.892780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.892834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.892850 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:02 crc kubenswrapper[4958]: I0320 09:00:02.893764 4958 scope.go:117] "RemoveContainer" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 09:00:02 crc kubenswrapper[4958]: E0320 09:00:02.894025 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:03 crc kubenswrapper[4958]: I0320 09:00:03.350001 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:03Z is after 2026-02-23T05:33:13Z Mar 20 09:00:04 crc kubenswrapper[4958]: W0320 09:00:04.035745 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:04Z is after 2026-02-23T05:33:13Z Mar 20 09:00:04 crc kubenswrapper[4958]: E0320 09:00:04.035836 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 09:00:04 crc kubenswrapper[4958]: I0320 09:00:04.193075 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 09:00:04 crc kubenswrapper[4958]: E0320 09:00:04.196682 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: 
certificate has expired or is not yet valid: current time 2026-03-20T09:00:04Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 09:00:04 crc kubenswrapper[4958]: I0320 09:00:04.348304 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:04Z is after 2026-02-23T05:33:13Z Mar 20 09:00:05 crc kubenswrapper[4958]: I0320 09:00:05.348116 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:05Z is after 2026-02-23T05:33:13Z Mar 20 09:00:05 crc kubenswrapper[4958]: E0320 09:00:05.508434 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:05Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:06 crc kubenswrapper[4958]: I0320 09:00:06.348890 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:06Z is after 2026-02-23T05:33:13Z Mar 20 09:00:06 crc kubenswrapper[4958]: W0320 09:00:06.909668 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:06Z is after 2026-02-23T05:33:13Z Mar 20 09:00:06 crc kubenswrapper[4958]: E0320 09:00:06.909785 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:06Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 09:00:07 crc kubenswrapper[4958]: I0320 09:00:07.350203 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:07Z 
is after 2026-02-23T05:33:13Z Mar 20 09:00:07 crc kubenswrapper[4958]: W0320 09:00:07.456809 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:07Z is after 2026-02-23T05:33:13Z Mar 20 09:00:07 crc kubenswrapper[4958]: E0320 09:00:07.457249 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.350683 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:08Z is after 2026-02-23T05:33:13Z Mar 20 09:00:08 crc kubenswrapper[4958]: E0320 09:00:08.909752 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:08Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.914991 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.916375 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.916435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.916446 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:08 crc kubenswrapper[4958]: I0320 09:00:08.916489 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 09:00:08 crc kubenswrapper[4958]: E0320 09:00:08.921416 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:08Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.348889 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:09Z is after 2026-02-23T05:33:13Z Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.798523 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller 
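[editor's note] The kubelet keeps retrying its node lease, and the retry interval settles at 7s in these entries (6.4s on the first attempt); node registration fails the same way, which is why the eviction manager keeps logging node "crc" not found. A small sketch for pulling the retry cadence out of an excerpt like this one, assuming it is saved as kubelet.log (the filename is an assumption; the line format is the one above):

    import re

    # Matches: "Failed to ensure lease exists, will retry" ... interval="7s"
    pat = re.compile(r'Failed to ensure lease exists.*?interval="([^"]+)"')

    with open("kubelet.log", encoding="utf-8") as fh:
        for n, line in enumerate(fh, 1):
            for interval in pat.findall(line):
                print(f"line {n}: lease retry in {interval}")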
namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.798715 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.798824 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.799026 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.800482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.800549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.800563 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.801091 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 09:00:09 crc kubenswrapper[4958]: I0320 09:00:09.801264 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc" gracePeriod=30 Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.348512 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:10Z is after 2026-02-23T05:33:13Z Mar 20 09:00:10 crc kubenswrapper[4958]: E0320 09:00:10.493428 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.620774 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.621181 4958 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc" exitCode=255 Mar 20 09:00:10 crc 
kubenswrapper[4958]: I0320 09:00:10.621233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc"} Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.621283 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b"} Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.621466 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.622488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.622551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:10 crc kubenswrapper[4958]: I0320 09:00:10.622563 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:11 crc kubenswrapper[4958]: I0320 09:00:11.348531 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:11Z is after 2026-02-23T05:33:13Z Mar 20 09:00:12 crc kubenswrapper[4958]: I0320 09:00:12.349414 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:12Z is after 2026-02-23T05:33:13Z Mar 20 09:00:13 crc kubenswrapper[4958]: I0320 09:00:13.348506 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:13Z is after 2026-02-23T05:33:13Z Mar 20 09:00:14 crc kubenswrapper[4958]: I0320 09:00:14.351111 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:14Z is after 2026-02-23T05:33:13Z Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.349230 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:15Z is after 2026-02-23T05:33:13Z Mar 20 09:00:15 crc kubenswrapper[4958]: E0320 09:00:15.513434 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:15Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:15 crc kubenswrapper[4958]: E0320 09:00:15.914195 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:15Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.922313 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.924001 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.924048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.924060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:15 crc kubenswrapper[4958]: I0320 09:00:15.924095 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 09:00:15 crc kubenswrapper[4958]: E0320 09:00:15.929145 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:15Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.351396 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:16Z is after 2026-02-23T05:33:13Z Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.434699 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.436413 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.436462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.436471 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.437182 4958 scope.go:117] "RemoveContainer" 
containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.640582 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.642970 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb"} Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.643178 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.644179 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.644234 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.644246 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.797902 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.798063 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.799151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.799192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:16 crc kubenswrapper[4958]: I0320 09:00:16.799208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.348647 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:17Z is after 2026-02-23T05:33:13Z Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.648519 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.651530 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.654320 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb" exitCode=255 Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.654363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb"} Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.654404 4958 scope.go:117] "RemoveContainer" containerID="acf671e8e0fabfb737ab643b56c39e5e1ebff92d66bef7afe26d0a5ce485f53c" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.654627 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.655931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.655954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.655968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:17 crc kubenswrapper[4958]: I0320 09:00:17.656446 4958 scope.go:117] "RemoveContainer" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb" Mar 20 09:00:17 crc kubenswrapper[4958]: E0320 09:00:17.656638 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.350830 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:18Z is after 2026-02-23T05:33:13Z Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.642363 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.659679 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.662449 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.663341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.663490 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.663577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:18 crc kubenswrapper[4958]: I0320 09:00:18.664220 4958 scope.go:117] "RemoveContainer" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb" Mar 20 09:00:18 crc kubenswrapper[4958]: E0320 09:00:18.664504 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: 
\"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:19 crc kubenswrapper[4958]: I0320 09:00:19.348709 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:19Z is after 2026-02-23T05:33:13Z Mar 20 09:00:19 crc kubenswrapper[4958]: I0320 09:00:19.798956 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 09:00:19 crc kubenswrapper[4958]: I0320 09:00:19.799101 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.081190 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.081424 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.083040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.083104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.083116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.349235 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:20Z is after 2026-02-23T05:33:13Z Mar 20 09:00:20 crc kubenswrapper[4958]: E0320 09:00:20.493958 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 09:00:20 crc kubenswrapper[4958]: I0320 09:00:20.517339 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 09:00:20 crc kubenswrapper[4958]: E0320 09:00:20.522980 4958 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:20Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 09:00:20 crc kubenswrapper[4958]: E0320 09:00:20.524242 4958 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError"
Mar 20 09:00:21 crc kubenswrapper[4958]: I0320 09:00:21.348218 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:21Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.351408 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:22Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.890502 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.890807 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.892483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.892544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.892561 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.893419 4958 scope.go:117] "RemoveContainer" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb"
Mar 20 09:00:22 crc kubenswrapper[4958]: E0320 09:00:22.893785 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 09:00:22 crc kubenswrapper[4958]: E0320 09:00:22.919218 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:22Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.929556 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.930981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.931031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.931043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:22 crc kubenswrapper[4958]: I0320 09:00:22.931073 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 09:00:22 crc kubenswrapper[4958]: E0320 09:00:22.934062 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:22Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 09:00:23 crc kubenswrapper[4958]: I0320 09:00:23.348349 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:23Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:24 crc kubenswrapper[4958]: I0320 09:00:24.350345 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:24Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:25 crc kubenswrapper[4958]: W0320 09:00:25.008512 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:25 crc kubenswrapper[4958]: E0320 09:00:25.008644 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 09:00:25 crc kubenswrapper[4958]: I0320 09:00:25.349011 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:25 crc kubenswrapper[4958]: W0320 09:00:25.393174 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:25 crc kubenswrapper[4958]: E0320 09:00:25.393317 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 09:00:25 crc kubenswrapper[4958]: E0320 09:00:25.519642 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:25Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:26 crc kubenswrapper[4958]: I0320 09:00:26.350681 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:26Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:27 crc kubenswrapper[4958]: W0320 09:00:27.307017 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:27Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:27 crc kubenswrapper[4958]: E0320 09:00:27.307115 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 09:00:27 crc kubenswrapper[4958]: I0320 09:00:27.348978 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:27Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:27 crc kubenswrapper[4958]: W0320 09:00:27.351942 4958 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:27Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:27 crc kubenswrapper[4958]: E0320 09:00:27.352057 4958 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError"
Mar 20 09:00:28 crc kubenswrapper[4958]: I0320 09:00:28.349361 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:28Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.348160 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:29Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.798845 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.798986 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:00:29 crc kubenswrapper[4958]: E0320 09:00:29.923747 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:29Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.934947 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.936383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.936458 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.936511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:29 crc kubenswrapper[4958]: I0320 09:00:29.936553 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 09:00:29 crc kubenswrapper[4958]: E0320 09:00:29.941801 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:29Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 09:00:30 crc kubenswrapper[4958]: I0320 09:00:30.349363 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:30Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:30 crc kubenswrapper[4958]: E0320 09:00:30.494698 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 09:00:31 crc kubenswrapper[4958]: I0320 09:00:31.348805 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:31Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.349632 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:32Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.870184 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc"
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.870430 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.871834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.871909 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:32 crc kubenswrapper[4958]: I0320 09:00:32.871924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:33 crc kubenswrapper[4958]: I0320 09:00:33.350461 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:33Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:34 crc kubenswrapper[4958]: I0320 09:00:34.349800 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:34Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:35 crc kubenswrapper[4958]: I0320 09:00:35.349133 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:35Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:35 crc kubenswrapper[4958]: E0320 09:00:35.527515 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:35Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.348650 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:36Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:36 crc kubenswrapper[4958]: E0320 09:00:36.928269 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:36Z is after 2026-02-23T05:33:13Z" interval="7s"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.942390 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.944361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.944430 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.944445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:36 crc kubenswrapper[4958]: I0320 09:00:36.944482 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 09:00:36 crc kubenswrapper[4958]: E0320 09:00:36.948445 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:36Z is after 2026-02-23T05:33:13Z" node="crc"
Mar 20 09:00:37 crc kubenswrapper[4958]: I0320 09:00:37.349133 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:37Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.349066 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:38Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.434464 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.435879 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.435955 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.435969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.436804 4958 scope.go:117] "RemoveContainer" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.721413 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.723334 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263"}
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.723533 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.724376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.724398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:38 crc kubenswrapper[4958]: I0320 09:00:38.724407 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.349628 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.727493 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.727924 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.730156 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" exitCode=255
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.730209 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263"}
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.730269 4958 scope.go:117] "RemoveContainer" containerID="ac2700f779b8a81faec4f60f13d2449f8f02fc0f3864f1b6f5fc16be35abfdbb"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.730436 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.731746 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.731800 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.731815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.732662 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263"
Mar 20 09:00:39 crc kubenswrapper[4958]: E0320 09:00:39.732887 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.798518 4958 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.798681 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.798764 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.798951 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.800111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.800135 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.800143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.800631 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted"
Mar 20 09:00:39 crc kubenswrapper[4958]: I0320 09:00:39.800727 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b" gracePeriod=30
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.349488 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:40Z is after 2026-02-23T05:33:13Z
Mar 20 09:00:40 crc kubenswrapper[4958]: E0320 09:00:40.495155 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.735126 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.738942 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740233 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740590 4958 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b" exitCode=255
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740630 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b"}
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740681 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"f883f0a9bc518bf7cdbdcef43df507c8e7162636bb3bcfe5dcacd54b3fd8dfed"}
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740703 4958 scope.go:117] "RemoveContainer" containerID="a68679e5e1d5adf4a50b6d03443dd3e90c5c1388ab02bc7783b1590b372933cc"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.740893 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.741804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.741839 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:40 crc kubenswrapper[4958]: I0320 09:00:40.741851 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:41 crc kubenswrapper[4958]: I0320 09:00:41.350910 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 09:00:41 crc kubenswrapper[4958]: I0320 09:00:41.746354 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.349634 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.890375 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.891314 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.892994 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.893043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.893059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:42 crc kubenswrapper[4958]: I0320 09:00:42.893926 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263"
Mar 20 09:00:42 crc kubenswrapper[4958]: E0320 09:00:42.894163 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.351080 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 09:00:43 crc kubenswrapper[4958]: E0320 09:00:43.933576 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.949025 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.950890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.951092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.951234 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:00:43 crc kubenswrapper[4958]: I0320 09:00:43.951390 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc"
Mar 20 09:00:43 crc kubenswrapper[4958]: E0320 09:00:43.957712 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc"
Mar 20 09:00:44 crc kubenswrapper[4958]: I0320 09:00:44.352282 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 09:00:45 crc kubenswrapper[4958]: I0320 09:00:45.349740 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.535891 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e810956df0836 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,LastTimestamp:2026-03-20 08:59:40.342700086 +0000 UTC m=+0.664716044,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.542814 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.549203 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.551329 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.555440 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81096065b937 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.502522167 +0000 UTC m=+0.824538115,LastTimestamp:2026-03-20 08:59:40.502522167 +0000 UTC m=+0.824538115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.560970 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.535815663 +0000 UTC m=+0.857831611,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.566188 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.535839893 +0000 UTC m=+0.857855851,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.570955 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.535848993 +0000 UTC m=+0.857864951,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.575475 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.537027517 +0000 UTC m=+0.859043465,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.581492 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.537037907 +0000 UTC m=+0.859053865,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.586524 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.537046177 +0000 UTC m=+0.859062135,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.591796 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.537369181 +0000 UTC m=+0.859385139,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.596913 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.537383881 +0000 UTC m=+0.859399839,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.603174 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.537394971 +0000 UTC m=+0.859410929,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.608018 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.537847767 +0000 UTC m=+0.859863725,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.615224 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.537886417 +0000 UTC m=+0.859902375,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.619998 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.537895897 +0000 UTC m=+0.859911855,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.624874 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.539075451 +0000 UTC m=+0.861091429,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.629335 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.539106452 +0000 UTC m=+0.861122410,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.634965 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.539119262 +0000 UTC m=+0.861135220,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.639425 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.539136432 +0000 UTC m=+0.861152390,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.644182 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.539151232 +0000 UTC m=+0.861167190,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.649208 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5ee4b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5ee4b default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408163915 +0000 UTC m=+0.730179873,LastTimestamp:2026-03-20 08:59:40.539167092 +0000 UTC m=+0.861183050,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.654988 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac5070a\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac5070a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408104714 +0000 UTC m=+0.730120692,LastTimestamp:2026-03-20 08:59:40.539952791 +0000 UTC m=+0.861968749,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.659916 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e81095ac573de\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e81095ac573de default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.408132574 +0000 UTC m=+0.730148532,LastTimestamp:2026-03-20 08:59:40.539964911 +0000 UTC m=+0.861980869,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.669076 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8109792b2ba8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.91811524 +0000 UTC m=+1.240131198,LastTimestamp:2026-03-20 08:59:40.91811524 +0000 UTC m=+1.240131198,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.673791 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8109792dbb7c openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.918283132 +0000 UTC m=+1.240299090,LastTimestamp:2026-03-20 08:59:40.918283132 +0000 UTC m=+1.240299090,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.677749 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810979a85ab5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.926319285 +0000 UTC m=+1.248335243,LastTimestamp:2026-03-20 08:59:40.926319285 +0000 UTC m=+1.248335243,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.681642 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e81097aa37cac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.942777516 +0000 UTC m=+1.264793474,LastTimestamp:2026-03-20 08:59:40.942777516 +0000 UTC m=+1.264793474,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.685358 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e81097b388a11 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:40.952545809 +0000 UTC m=+1.274561767,LastTimestamp:2026-03-20 08:59:40.952545809 +0000 UTC m=+1.274561767,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.690027 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e81099e8fd0e7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.545468135 +0000 UTC m=+1.867484103,LastTimestamp:2026-03-20 08:59:41.545468135 +0000 UTC m=+1.867484103,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.693423 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e81099eaa5b84 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.547207556 +0000 UTC m=+1.869223554,LastTimestamp:2026-03-20 08:59:41.547207556 +0000 UTC m=+1.869223554,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.697927 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e81099f40647a openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.55704025 +0000 UTC m=+1.879056208,LastTimestamp:2026-03-20 08:59:41.55704025 +0000 UTC m=+1.879056208,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.702905 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e81099f96b86c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.562697836 +0000 UTC m=+1.884713794,LastTimestamp:2026-03-20 08:59:41.562697836 +0000 UTC m=+1.884713794,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.706370 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e81099f99f4fc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.562909948 +0000 UTC m=+1.884925906,LastTimestamp:2026-03-20 08:59:41.562909948 +0000 UTC m=+1.884925906,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.710538 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e81099fa6231b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.563708187 +0000 UTC m=+1.885724145,LastTimestamp:2026-03-20 08:59:41.563708187 +0000 UTC m=+1.885724145,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.714248 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e81099fa71145 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.563769157 +0000 UTC m=+1.885785105,LastTimestamp:2026-03-20 08:59:41.563769157 +0000 UTC m=+1.885785105,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.717702 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e81099fc48502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.56569933 +0000 UTC m=+1.887715288,LastTimestamp:2026-03-20 08:59:41.56569933 +0000 UTC m=+1.887715288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}"
Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.720930 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group 
\"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e81099fd4c14f openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.566763343 +0000 UTC m=+1.888779301,LastTimestamp:2026-03-20 08:59:41.566763343 +0000 UTC m=+1.888779301,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.724244 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8109a117efe6 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.587943398 +0000 UTC m=+1.909959356,LastTimestamp:2026-03-20 08:59:41.587943398 +0000 UTC m=+1.909959356,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.728189 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109a13726cf openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.589989071 +0000 UTC m=+1.912005029,LastTimestamp:2026-03-20 08:59:41.589989071 +0000 UTC m=+1.912005029,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.731238 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109b1a24995 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.865445781 +0000 UTC m=+2.187461749,LastTimestamp:2026-03-20 08:59:41.865445781 +0000 UTC m=+2.187461749,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.735363 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109b250a9b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.876873653 +0000 UTC m=+2.198889611,LastTimestamp:2026-03-20 08:59:41.876873653 +0000 UTC m=+2.198889611,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.738367 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109b2651f09 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.878214409 +0000 UTC m=+2.200230377,LastTimestamp:2026-03-20 08:59:41.878214409 +0000 UTC m=+2.200230377,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.742697 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109bd19ac9d openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.057819293 +0000 UTC m=+2.379835261,LastTimestamp:2026-03-20 08:59:42.057819293 +0000 UTC m=+2.379835261,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.746579 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109bde57a93 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.071175827 +0000 UTC m=+2.393191785,LastTimestamp:2026-03-20 08:59:42.071175827 +0000 UTC m=+2.393191785,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.750703 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109bdf92141 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.072463681 +0000 UTC m=+2.394479639,LastTimestamp:2026-03-20 08:59:42.072463681 +0000 UTC m=+2.394479639,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.754288 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109c8300dcd openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.243835341 +0000 UTC m=+2.565851299,LastTimestamp:2026-03-20 08:59:42.243835341 +0000 UTC m=+2.565851299,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.758521 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109c8c93872 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.253873266 +0000 UTC m=+2.575889224,LastTimestamp:2026-03-20 08:59:42.253873266 +0000 UTC m=+2.575889224,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.760896 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8109d4f7a6ff openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.458242815 +0000 UTC m=+2.780258813,LastTimestamp:2026-03-20 08:59:42.458242815 +0000 UTC m=+2.780258813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.766754 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109d599ab05 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.468860677 +0000 UTC m=+2.790876675,LastTimestamp:2026-03-20 08:59:42.468860677 +0000 UTC m=+2.790876675,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.772559 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109d5dc9b59 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.473247577 +0000 UTC m=+2.795263525,LastTimestamp:2026-03-20 08:59:42.473247577 +0000 UTC m=+2.795263525,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.778568 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8109d601375f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.475646815 +0000 UTC m=+2.797662773,LastTimestamp:2026-03-20 08:59:42.475646815 +0000 UTC m=+2.797662773,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.784734 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109e3d312d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.707503829 +0000 UTC m=+3.029519787,LastTimestamp:2026-03-20 08:59:42.707503829 +0000 UTC m=+3.029519787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.789740 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8109e3dbd247 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.708077127 +0000 UTC m=+3.030093085,LastTimestamp:2026-03-20 08:59:42.708077127 +0000 UTC m=+3.030093085,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.793651 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8109e4520bbd openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.715825085 +0000 UTC m=+3.037841043,LastTimestamp:2026-03-20 08:59:42.715825085 +0000 UTC m=+3.037841043,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.798691 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109e459cb48 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.716332872 +0000 UTC m=+3.038348830,LastTimestamp:2026-03-20 08:59:42.716332872 +0000 UTC 
m=+3.038348830,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.804454 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109e4833137 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.719045943 +0000 UTC m=+3.041061901,LastTimestamp:2026-03-20 08:59:42.719045943 +0000 UTC m=+3.041061901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.808685 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109e493d7af openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.720137135 +0000 UTC m=+3.042153093,LastTimestamp:2026-03-20 08:59:42.720137135 +0000 UTC m=+3.042153093,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.813542 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8109e4b0780d openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.722013197 +0000 UTC m=+3.044029155,LastTimestamp:2026-03-20 08:59:42.722013197 +0000 UTC m=+3.044029155,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.818023 4958 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109e65e3350 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.75017608 +0000 UTC m=+3.072192038,LastTimestamp:2026-03-20 08:59:42.75017608 +0000 UTC m=+3.072192038,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.822285 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109e66e7907 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.751242503 +0000 UTC m=+3.073258461,LastTimestamp:2026-03-20 08:59:42.751242503 +0000 UTC m=+3.073258461,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.828120 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8109e6d18843 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.757734467 +0000 UTC m=+3.079750425,LastTimestamp:2026-03-20 08:59:42.757734467 +0000 UTC m=+3.079750425,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.834547 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109f44c7b8b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.983895947 +0000 UTC m=+3.305911905,LastTimestamp:2026-03-20 08:59:42.983895947 +0000 UTC m=+3.305911905,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.839813 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109f44d71d5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:42.983958997 +0000 UTC m=+3.305974955,LastTimestamp:2026-03-20 08:59:42.983958997 +0000 UTC m=+3.305974955,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.845342 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109f554f1c3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.001227715 +0000 UTC m=+3.323243693,LastTimestamp:2026-03-20 08:59:43.001227715 +0000 UTC m=+3.323243693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.850936 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8109f5667a5f openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already 
present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.002376799 +0000 UTC m=+3.324392757,LastTimestamp:2026-03-20 08:59:43.002376799 +0000 UTC m=+3.324392757,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.856004 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109f585a734 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.004419892 +0000 UTC m=+3.326435850,LastTimestamp:2026-03-20 08:59:43.004419892 +0000 UTC m=+3.326435850,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.860718 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8109f593ecd7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.005355223 +0000 UTC m=+3.327371181,LastTimestamp:2026-03-20 08:59:43.005355223 +0000 UTC m=+3.327371181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.865188 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e810a0208b0f2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.214334194 +0000 UTC m=+3.536350152,LastTimestamp:2026-03-20 08:59:43.214334194 +0000 UTC 
m=+3.536350152,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.870401 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a021dcc2c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.21571742 +0000 UTC m=+3.537733378,LastTimestamp:2026-03-20 08:59:43.21571742 +0000 UTC m=+3.537733378,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.874494 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e810a031148ce openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.231674574 +0000 UTC m=+3.553690532,LastTimestamp:2026-03-20 08:59:43.231674574 +0000 UTC m=+3.553690532,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.879789 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a034232b5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.234880181 +0000 UTC m=+3.556896139,LastTimestamp:2026-03-20 08:59:43.234880181 +0000 UTC m=+3.556896139,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.885022 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" 
cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a03541411 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.236051985 +0000 UTC m=+3.558067953,LastTimestamp:2026-03-20 08:59:43.236051985 +0000 UTC m=+3.558067953,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.890030 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a0ddaa53e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.412643134 +0000 UTC m=+3.734659092,LastTimestamp:2026-03-20 08:59:43.412643134 +0000 UTC m=+3.734659092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.895067 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a0ea4f408 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.425901576 +0000 UTC m=+3.747917534,LastTimestamp:2026-03-20 08:59:43.425901576 +0000 UTC m=+3.747917534,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.904674 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a0eb79c74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.42712434 +0000 UTC m=+3.749140288,LastTimestamp:2026-03-20 08:59:43.42712434 +0000 UTC m=+3.749140288,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.911054 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a13053c00 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.49932032 +0000 UTC m=+3.821336268,LastTimestamp:2026-03-20 08:59:43.49932032 +0000 UTC m=+3.821336268,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.915473 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a1aabfd50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.627689296 +0000 UTC m=+3.949705254,LastTimestamp:2026-03-20 08:59:43.627689296 +0000 UTC m=+3.949705254,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.920764 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a1b663a07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.639894535 +0000 UTC m=+3.961910493,LastTimestamp:2026-03-20 08:59:43.639894535 +0000 UTC m=+3.961910493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.926093 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a1f004a90 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.70032296 +0000 UTC m=+4.022338918,LastTimestamp:2026-03-20 08:59:43.70032296 +0000 UTC m=+4.022338918,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.930561 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a205c1ba0 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.723117472 +0000 UTC m=+4.045133430,LastTimestamp:2026-03-20 08:59:43.723117472 +0000 UTC m=+4.045133430,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.936211 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e810a0eb79c74\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a0eb79c74 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 
08:59:43.42712434 +0000 UTC m=+3.749140288,LastTimestamp:2026-03-20 08:59:44.505555485 +0000 UTC m=+4.827571443,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.941392 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a4f27b681 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.508212865 +0000 UTC m=+4.830228813,LastTimestamp:2026-03-20 08:59:44.508212865 +0000 UTC m=+4.830228813,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.945635 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a5a0e4d37 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.691096887 +0000 UTC m=+5.013112845,LastTimestamp:2026-03-20 08:59:44.691096887 +0000 UTC m=+5.013112845,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.951995 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e810a1aabfd50\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a1aabfd50 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.627689296 +0000 UTC m=+3.949705254,LastTimestamp:2026-03-20 08:59:44.692024728 +0000 UTC m=+5.014040676,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.957439 4958 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-apiserver-crc.189e810a1b663a07\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810a1b663a07 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:43.639894535 +0000 UTC m=+3.961910493,LastTimestamp:2026-03-20 08:59:44.708435326 +0000 UTC m=+5.030451304,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.962094 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a5b1ee11f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.708960543 +0000 UTC m=+5.030976511,LastTimestamp:2026-03-20 08:59:44.708960543 +0000 UTC m=+5.030976511,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.969785 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a5b3242f9 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.710230777 +0000 UTC m=+5.032246755,LastTimestamp:2026-03-20 08:59:44.710230777 +0000 UTC m=+5.032246755,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.975653 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a67f75b44 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.92447418 +0000 UTC m=+5.246490138,LastTimestamp:2026-03-20 08:59:44.92447418 +0000 UTC m=+5.246490138,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.980482 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a68ce67cf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.938567631 +0000 UTC m=+5.260583589,LastTimestamp:2026-03-20 08:59:44.938567631 +0000 UTC m=+5.260583589,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.986677 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a68e942b1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:44.940327601 +0000 UTC m=+5.262343559,LastTimestamp:2026-03-20 08:59:44.940327601 +0000 UTC m=+5.262343559,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.992742 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a75650073 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.149763699 +0000 UTC m=+5.471779657,LastTimestamp:2026-03-20 08:59:45.149763699 +0000 UTC m=+5.471779657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:45 crc kubenswrapper[4958]: E0320 09:00:45.997032 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a764501a3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.164444067 +0000 UTC m=+5.486460025,LastTimestamp:2026-03-20 08:59:45.164444067 +0000 UTC m=+5.486460025,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.000651 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a7659c639 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.165805113 +0000 UTC m=+5.487821071,LastTimestamp:2026-03-20 08:59:45.165805113 +0000 UTC m=+5.487821071,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.004397 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a82627651 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.367701073 +0000 UTC m=+5.689717031,LastTimestamp:2026-03-20 08:59:45.367701073 +0000 UTC m=+5.689717031,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.009324 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a830ded0a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.378938122 +0000 UTC m=+5.700954080,LastTimestamp:2026-03-20 08:59:45.378938122 +0000 UTC m=+5.700954080,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.012890 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a831f55e3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.380079075 +0000 UTC m=+5.702095033,LastTimestamp:2026-03-20 08:59:45.380079075 +0000 UTC m=+5.702095033,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.017619 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a8ec11c87 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.575230599 +0000 UTC m=+5.897246557,LastTimestamp:2026-03-20 08:59:45.575230599 +0000 UTC m=+5.897246557,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.021800 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e810a8f6bc4ae openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:45.586414766 +0000 UTC m=+5.908430724,LastTimestamp:2026-03-20 08:59:45.586414766 +0000 UTC m=+5.908430724,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.028434 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-controller-manager-crc.189e810b8a7420a9 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 09:00:46 crc kubenswrapper[4958]: body: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:49.798043817 +0000 UTC m=+10.120059775,LastTimestamp:2026-03-20 08:59:49.798043817 +0000 UTC m=+10.120059775,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.033622 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e810b8a751e42 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:49.798108738 +0000 UTC m=+10.120124696,LastTimestamp:2026-03-20 08:59:49.798108738 +0000 UTC m=+10.120124696,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.038409 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-apiserver-crc.189e810cde6df195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 09:00:46 crc kubenswrapper[4958]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot 
get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 09:00:46 crc kubenswrapper[4958]: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:55.501891989 +0000 UTC m=+15.823907947,LastTimestamp:2026-03-20 08:59:55.501891989 +0000 UTC m=+15.823907947,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.042296 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810cde6f0dbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:55.501964731 +0000 UTC m=+15.823980689,LastTimestamp:2026-03-20 08:59:55.501964731 +0000 UTC m=+15.823980689,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.046569 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e810cde6df195\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-apiserver-crc.189e810cde6df195 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 09:00:46 crc kubenswrapper[4958]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 09:00:46 crc kubenswrapper[4958]: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:55.501891989 +0000 UTC m=+15.823907947,LastTimestamp:2026-03-20 08:59:55.507631475 +0000 UTC m=+15.829647443,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.050832 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e810cde6f0dbb\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e810cde6f0dbb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:55.501964731 +0000 UTC m=+15.823980689,LastTimestamp:2026-03-20 08:59:55.507693866 +0000 UTC m=+15.829709824,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.055685 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-controller-manager-crc.189e810dde827c58 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 09:00:46 crc kubenswrapper[4958]: body: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.798205528 +0000 UTC m=+20.120221526,LastTimestamp:2026-03-20 08:59:59.798205528 +0000 UTC m=+20.120221526,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.060590 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e810dde83987e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.79827827 +0000 UTC m=+20.120294258,LastTimestamp:2026-03-20 08:59:59.79827827 +0000 UTC m=+20.120294258,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.067083 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e810dde827c58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"openshift-kube-controller-manager\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-controller-manager-crc.189e810dde827c58 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 09:00:46 crc kubenswrapper[4958]: body: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.798205528 +0000 UTC m=+20.120221526,LastTimestamp:2026-03-20 09:00:09.798677688 +0000 UTC m=+30.120693666,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.072368 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e810dde83987e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e810dde83987e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.79827827 +0000 UTC m=+20.120294258,LastTimestamp:2026-03-20 09:00:09.798752061 +0000 UTC m=+30.120768019,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.077961 4958 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e811032bcbbd3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 09:00:09.801243603 +0000 UTC m=+30.123259561,LastTimestamp:2026-03-20 09:00:09.801243603 +0000 UTC m=+30.123259561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc 
kubenswrapper[4958]: E0320 09:00:46.082939 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e81099fc48502\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e81099fc48502 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.56569933 +0000 UTC m=+1.887715288,LastTimestamp:2026-03-20 09:00:09.918997917 +0000 UTC m=+30.241013875,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.087424 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8109b1a24995\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109b1a24995 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.865445781 +0000 UTC m=+2.187461749,LastTimestamp:2026-03-20 09:00:10.081039143 +0000 UTC m=+30.403055101,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.091631 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8109b250a9b5\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8109b250a9b5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:41.876873653 +0000 UTC m=+2.198889611,LastTimestamp:2026-03-20 09:00:10.091919467 +0000 UTC m=+30.413935435,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.096958 
4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e810dde827c58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-controller-manager-crc.189e810dde827c58 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 09:00:46 crc kubenswrapper[4958]: body: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.798205528 +0000 UTC m=+20.120221526,LastTimestamp:2026-03-20 09:00:19.799070476 +0000 UTC m=+40.121086434,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.101454 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e810dde83987e\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e810dde83987e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.79827827 +0000 UTC m=+20.120294258,LastTimestamp:2026-03-20 09:00:19.799132168 +0000 UTC m=+40.121148126,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:00:46 crc kubenswrapper[4958]: E0320 09:00:46.106724 4958 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e810dde827c58\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 09:00:46 crc kubenswrapper[4958]: &Event{ObjectMeta:{kube-controller-manager-crc.189e810dde827c58 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection 
(Client.Timeout exceeded while awaiting headers) Mar 20 09:00:46 crc kubenswrapper[4958]: body: Mar 20 09:00:46 crc kubenswrapper[4958]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 08:59:59.798205528 +0000 UTC m=+20.120221526,LastTimestamp:2026-03-20 09:00:29.798944488 +0000 UTC m=+50.120960476,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 09:00:46 crc kubenswrapper[4958]: > Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.351242 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.798620 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.798858 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.800170 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.800209 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:46 crc kubenswrapper[4958]: I0320 09:00:46.800220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:47 crc kubenswrapper[4958]: I0320 09:00:47.350018 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.351261 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.642405 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.642760 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.644416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.644462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.644477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:48 crc kubenswrapper[4958]: I0320 09:00:48.645399 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:00:48 crc kubenswrapper[4958]: E0320 09:00:48.645645 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with 
CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.350448 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.621827 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.622049 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.622187 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.623478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.623528 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.623541 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.769762 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.770674 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.770711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:49 crc kubenswrapper[4958]: I0320 09:00:49.770725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.090735 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.351771 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:50 crc kubenswrapper[4958]: E0320 09:00:50.495715 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.772623 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.773813 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.773883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" 
Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.773899 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:50 crc kubenswrapper[4958]: E0320 09:00:50.940688 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.958759 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.960131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.960183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.960195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:50 crc kubenswrapper[4958]: I0320 09:00:50.960225 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 09:00:50 crc kubenswrapper[4958]: E0320 09:00:50.966133 4958 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 09:00:51 crc kubenswrapper[4958]: I0320 09:00:51.350070 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:52 crc kubenswrapper[4958]: I0320 09:00:52.352381 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:52 crc kubenswrapper[4958]: I0320 09:00:52.525926 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 09:00:52 crc kubenswrapper[4958]: I0320 09:00:52.556582 4958 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 09:00:53 crc kubenswrapper[4958]: I0320 09:00:53.355563 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:54 crc kubenswrapper[4958]: I0320 09:00:54.350424 4958 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 09:00:55 crc kubenswrapper[4958]: I0320 09:00:55.262353 4958 csr.go:261] certificate signing request csr-twhnm is approved, waiting to be issued Mar 20 09:00:55 crc kubenswrapper[4958]: I0320 09:00:55.273754 4958 csr.go:257] certificate signing request csr-twhnm is issued Mar 20 09:00:55 crc kubenswrapper[4958]: I0320 09:00:55.318813 4958 reconstruct.go:205] "DevicePaths of 
reconstructed volumes updated" Mar 20 09:00:56 crc kubenswrapper[4958]: I0320 09:00:56.171406 4958 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 09:00:56 crc kubenswrapper[4958]: I0320 09:00:56.275362 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-27 06:55:49.635720047 +0000 UTC Mar 20 09:00:56 crc kubenswrapper[4958]: I0320 09:00:56.275428 4958 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6765h54m53.360294827s for next certificate rotation Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.967249 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.968972 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.969024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.969044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.969184 4958 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.981090 4958 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.981532 4958 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 09:00:57 crc kubenswrapper[4958]: E0320 09:00:57.981571 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.987551 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.987592 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.987622 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.987643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:00:57 crc kubenswrapper[4958]: I0320 09:00:57.987657 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:00:57Z","lastTransitionTime":"2026-03-20T09:00:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.008784 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:57Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.017326 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.017384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.017399 4958 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.017420 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.017440 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:00:58Z","lastTransitionTime":"2026-03-20T09:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.029834 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the preceding attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.039354 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.039418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
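Every patch attempt above fails on the same dial error: nothing is listening yet on the network-node-identity webhook port. A minimal Go sketch of probing that endpoint; only the 127.0.0.1:9743 address is taken from the log, everything else is an illustrative assumption:

```go
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// The kubelet's status patch dies here while the webhook pod is down:
	// a plain TCP dial reproduces the "connect: connection refused" above.
	conn, err := net.DialTimeout("tcp", "127.0.0.1:9743", 2*time.Second)
	if err != nil {
		fmt.Println("webhook unreachable:", err)
		return
	}
	conn.Close()
	fmt.Println("webhook port is accepting connections")
}
```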
Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.039437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.039477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.039495 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:00:58Z","lastTransitionTime":"2026-03-20T09:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.056308 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the preceding attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.068978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.069030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
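The condition object the setters.go records keep emitting is an ordinary NodeCondition. A minimal sketch of decoding it, with the raw JSON copied verbatim from the log; the hand-rolled struct mirrors the visible fields and is an illustration, not the k8s.io/api type:

```go
package main

import (
	"encoding/json"
	"fmt"
)

// NodeCondition mirrors only the fields that appear in the log lines above.
type NodeCondition struct {
	Type               string `json:"type"`
	Status             string `json:"status"`
	LastHeartbeatTime  string `json:"lastHeartbeatTime"`
	LastTransitionTime string `json:"lastTransitionTime"`
	Reason             string `json:"reason"`
	Message            string `json:"message"`
}

func main() {
	raw := `{"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:00:58Z","lastTransitionTime":"2026-03-20T09:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}`
	var c NodeCondition
	if err := json.Unmarshal([]byte(raw), &c); err != nil {
		panic(err)
	}
	// Ready=False with reason KubeletNotReady is what keeps the node NotReady
	// until a CNI config shows up in /etc/kubernetes/cni/net.d/.
	fmt.Printf("%s=%s (%s)\n", c.Type, c.Status, c.Reason)
}
```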
Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.069042 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.069068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:00:58 crc kubenswrapper[4958]: I0320 09:00:58.069079 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:00:58Z","lastTransitionTime":"2026-03-20T09:00:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.079711 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload identical to the preceding attempt] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.079885 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.079919 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
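The run above is a bounded retry: each failed patch logs "will retry" until the budget is exhausted and kubelet_node_status.go gives up with "update node status exceeds retry count". A minimal Go sketch of that pattern, assuming a fixed budget like the kubelet's nodeStatusUpdateRetry constant; the names echo the log's source file, but this is an illustration, not kubelet source:

```go
package main

import (
	"errors"
	"fmt"
)

// Assumed retry budget; the real constant lives in the kubelet.
const nodeStatusUpdateRetry = 5

// tryUpdateNodeStatus stands in for one status-patch attempt. Here every
// attempt fails the way the log shows: the webhook refuses the connection.
func tryUpdateNodeStatus() error {
	return errors.New(`failed calling webhook "node.network-node-identity.openshift.io": connect: connection refused`)
}

func updateNodeStatus() error {
	for i := 0; i < nodeStatusUpdateRetry; i++ {
		if err := tryUpdateNodeStatus(); err != nil {
			fmt.Printf("Error updating node status, will retry: %v\n", err)
			continue
		}
		return nil
	}
	return errors.New("update node status exceeds retry count")
}

func main() {
	if err := updateNodeStatus(); err != nil {
		fmt.Println(err)
	}
}
```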
Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.180969 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.282087 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.382257 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.483202 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.583507 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.684701 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.785200 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.885390 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:58 crc kubenswrapper[4958]: E0320 09:00:58.986337 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.086872 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.188032 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.289138 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.390212 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.491255 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.592466 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.693505 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.793713 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.894248 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:00:59 crc kubenswrapper[4958]: E0320 09:00:59.995485 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.095996 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.096872 4958 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 
09:01:00.196967 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.297212 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.397504 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.434534 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.436649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.436704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.436714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.437473 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.437706 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.496373 4958 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.498483 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.599039 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: I0320 09:01:00.609508 4958 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.699935 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.800723 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:00 crc kubenswrapper[4958]: E0320 09:01:00.901038 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.002166 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.102818 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.203885 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: 
E0320 09:01:01.304945 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.405289 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: I0320 09:01:01.434728 4958 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 09:01:01 crc kubenswrapper[4958]: I0320 09:01:01.436138 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:01 crc kubenswrapper[4958]: I0320 09:01:01.436192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:01 crc kubenswrapper[4958]: I0320 09:01:01.436207 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.506307 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.606534 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.706759 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.806870 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:01 crc kubenswrapper[4958]: E0320 09:01:01.907918 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.008616 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.109345 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.210282 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.310694 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.411092 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.511215 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.612216 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.713114 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.813487 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:02 crc kubenswrapper[4958]: E0320 09:01:02.914578 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 09:01:03 crc 
kubenswrapper[4958]: E0320 09:01:03.015298 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.116153 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.217241 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.318244 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.419356 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.520360 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.621527 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.722282 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.823439 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:03 crc kubenswrapper[4958]: E0320 09:01:03.924275 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.025296 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.126286 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.227658 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.328546 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.429130 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.530094 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: E0320 09:01:04.631199 4958 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.729811 4958 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.733691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.733762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.733791 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.733827 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.733852 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:04Z","lastTransitionTime":"2026-03-20T09:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.836198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.836252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.836266 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.836286 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.836301 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:04Z","lastTransitionTime":"2026-03-20T09:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.939127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.939176 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.939189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.939208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:04 crc kubenswrapper[4958]: I0320 09:01:04.939220 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:04Z","lastTransitionTime":"2026-03-20T09:01:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.042900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.042979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.042997 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.043024 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.043047 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.146032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.146091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.146103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.146120 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.146133 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.249108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.249166 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.249182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.249201 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.249216 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.352504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.352556 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.352566 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.352586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.352635 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.369846 4958 apiserver.go:52] "Watching apiserver"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.374985 4958 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.375759 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"]
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376262 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376276 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376492 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.376552 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376563 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376301 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.376315 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.376831 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.376879 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.379077 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.379562 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.380089 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.380350 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.380532 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.381173 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.381356 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.382156 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.383699 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.410332 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.425270 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.440551 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.446875 4958 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.455016 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.455969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.456005 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.456017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.456034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.456047 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.470926 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.482925 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488279 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488361 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488410 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488439 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488484 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488509 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488533 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488582 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488650 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488674 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488724 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488752 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488798 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488823 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488889 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488928 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.488977 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: 
\"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489008 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489055 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489083 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489136 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489162 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489209 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489235 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489282 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489310 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489338 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: 
\"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489397 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489445 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489487 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489577 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489670 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489717 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489743 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489770 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.489998 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490054 4958 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490080 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490106 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490151 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490175 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490225 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490260 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490319 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490346 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490394 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.490421 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490507 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490535 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490559 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490615 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490645 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490672 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490698 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490745 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490772 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 09:01:05 
crc kubenswrapper[4958]: I0320 09:01:05.490797 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490821 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490847 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490891 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490959 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.490992 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491018 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491044 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491069 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491093 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" 
(UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491122 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491153 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491195 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491221 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491250 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491281 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491310 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491369 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491396 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491423 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: 
\"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491452 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491485 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491524 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491551 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491574 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491631 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491667 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491694 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491719 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491744 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: 
\"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491770 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491796 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491821 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491847 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491872 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491896 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491921 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491944 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491968 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.491994 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492049 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492071 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492094 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492118 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492143 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492165 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492206 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492237 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492276 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 09:01:05 crc kubenswrapper[4958]: 
I0320 09:01:05.492307 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492332 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492357 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492381 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492405 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492432 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492457 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492483 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492507 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492535 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492562 4958 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492588 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492633 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492660 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492687 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492714 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492742 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492767 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492792 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492818 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492844 
4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492873 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492898 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492931 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492965 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.492996 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493021 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493046 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493070 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493095 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493120 4958 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493175 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493205 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493231 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493259 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493285 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493311 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493335 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493360 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493413 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 09:01:05 crc 
kubenswrapper[4958]: I0320 09:01:05.493437 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493462 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493493 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493518 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493543 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493569 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493621 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493659 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493695 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493734 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 09:01:05 crc kubenswrapper[4958]: 
I0320 09:01:05.493765 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493790 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493820 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.493849 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494051 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494083 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494108 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494135 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494162 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494188 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.494214 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494241 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494272 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494297 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494327 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494366 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494423 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.494461 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495054 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495096 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 09:01:05 
crc kubenswrapper[4958]: I0320 09:01:05.495124 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495149 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495176 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495201 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495227 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495255 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495282 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495306 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495333 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495360 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" 
(UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495386 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495411 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495436 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495460 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495486 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495513 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495538 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495562 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495586 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495635 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: 
\"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495693 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495741 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495781 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495812 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495839 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495865 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495895 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495923 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495955 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.495985 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.496038 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.496065 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.496093 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.496123 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.498075 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.498309 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.498846 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.499187 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.499274 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.499465 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.499561 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500015 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500097 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500438 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500454 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500405 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500759 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500774 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.500805 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501277 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501539 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501683 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501851 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.501983 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502026 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502230 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502310 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502471 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502532 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502627 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502933 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.502993 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503015 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503043 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503278 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503391 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503657 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.503761 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.504168 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.504255 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510564 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510759 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510766 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510813 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510671 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.510861 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.511206 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.511253 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.511188 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.511477 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.511869 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512046 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512102 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512134 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512392 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512884 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.512947 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.513450 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.513461 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.513905 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514288 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514310 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514521 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514633 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514651 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514953 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.514983 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.515391 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.515569 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.515820 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516299 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516300 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516438 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516495 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516864 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.516537 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517110 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517318 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517465 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517490 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517581 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517847 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517860 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.517906 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.518141 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.518480 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.518797 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.518983 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.518993 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519074 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519305 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519315 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519319 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519340 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.519536 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519614 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519534 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.519643 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:06.019611022 +0000 UTC m=+86.341626980 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519732 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.519889 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520015 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520185 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520261 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520693 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520895 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.520916 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.520954 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.520977 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520984 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521023 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.521082 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:06.021048603 +0000 UTC m=+86.343064561 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521263 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521400 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521439 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521459 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521503 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521539 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521727 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521888 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521961 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521903 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522127 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522263 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.522266 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.523810 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:06.023790093 +0000 UTC m=+86.345806051 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523126 4958 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523861 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524005 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524069 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524081 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524283 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h"
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524411 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524452 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524559 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524762 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.524810 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522341 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522373 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522517 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522712 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522747 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522932 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523079 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523144 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.521739 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523392 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523512 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.523537 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.525271 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.525322 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.525438 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:06.02541658 +0000 UTC m=+86.347432538 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541095 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541283 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541381 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541567 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541586 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541859 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.541961 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542055 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542077 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542167 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542255 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542293 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542560 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542565 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542623 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542789 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.542926 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.543034 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.543166 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert".
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520488 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.520453 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.543449 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.544125 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.544170 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.544227 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.544736 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.544772 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.544791 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:05 crc kubenswrapper[4958]: E0320 09:01:05.544864 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:06.044840282 +0000 UTC m=+86.366856240 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.545290 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.545425 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.545766 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.546030 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.546369 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.546439 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.546755 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.546886 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.547116 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.547237 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.547502 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.547723 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.547771 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548152 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548177 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548178 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548630 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.522311 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548711 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.553532 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.553813 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.555216 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.555528 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.548869 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.551701 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.555856 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.555994 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.556035 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.556162 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562477 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562508 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562745 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.562945 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.563473 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.566282 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.579294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.585742 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.586726 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.587184 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597445 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597536 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597641 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597667 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597660 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597682 4958 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597756 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597774 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597785 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597796 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.597808 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597819 4958 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597829 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597841 4958 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597852 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597866 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597875 4958 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597886 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597897 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597907 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597918 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597929 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597939 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.597949 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597959 4958 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597969 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597981 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.597993 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598006 4958 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598018 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598030 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598042 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598055 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598067 4958 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598078 4958 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598089 4958 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.598099 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598109 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598129 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598140 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598149 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598159 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598170 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598180 4958 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598190 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598202 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598215 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598225 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598235 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598245 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" 
(UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598256 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598266 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598277 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598288 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598297 4958 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598308 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598318 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598328 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598339 4958 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598349 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598362 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598372 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598381 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598391 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598402 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598412 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598422 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598432 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598444 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598455 4958 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598468 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598478 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598490 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598501 4958 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598511 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598522 4958 reconciler_common.go:293] "Volume detached for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598532 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598542 4958 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598552 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598561 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598571 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598580 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598590 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598612 4958 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598622 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598631 4958 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598642 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598651 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598661 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598671 4958 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598681 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598691 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598701 4958 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598711 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598720 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598731 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598741 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598752 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598761 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598771 4958 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598781 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598790 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598801 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598811 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598824 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598836 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598845 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598855 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598868 4958 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598879 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598889 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598899 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598908 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598920 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.598931 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598942 4958 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598952 4958 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598962 4958 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598972 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598983 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.598992 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599001 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599010 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599019 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599030 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599039 4958 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599049 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc 
kubenswrapper[4958]: I0320 09:01:05.599058 4958 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599067 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599076 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599086 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599095 4958 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599103 4958 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599112 4958 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599122 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599131 4958 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599140 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599150 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599175 4958 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599183 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599193 
4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599205 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599215 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599224 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599232 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599241 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599251 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599259 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599268 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599278 4958 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599287 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599296 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599305 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath 
\"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599314 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599324 4958 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599333 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599343 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599351 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599360 4958 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599369 4958 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599377 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599386 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599395 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599403 4958 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599412 4958 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599422 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 
09:01:05.599430 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599439 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599449 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599458 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599466 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599475 4958 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599484 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599494 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599504 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599514 4958 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599523 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599532 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599543 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599552 4958 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599562 4958 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599572 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599581 4958 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599591 4958 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599802 4958 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599812 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599821 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599831 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599843 4958 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599853 4958 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599862 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599870 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599880 4958 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599890 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599900 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599909 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.599918 4958 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.665391 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.665429 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.665441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.665457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.665466 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.693852 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.700481 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.705783 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 09:01:05 crc kubenswrapper[4958]: W0320 09:01:05.716809 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-b1dafe885600abf2cc3476f5030f8bb33eb96e58ebdb7094877d2021b65511a9 WatchSource:0}: Error finding container b1dafe885600abf2cc3476f5030f8bb33eb96e58ebdb7094877d2021b65511a9: Status 404 returned error can't find the container with id b1dafe885600abf2cc3476f5030f8bb33eb96e58ebdb7094877d2021b65511a9 Mar 20 09:01:05 crc kubenswrapper[4958]: W0320 09:01:05.719187 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e281a57918d426c98b72d4a62b05ef60b70ac143157c88f761b4e363394361c0 WatchSource:0}: Error finding container e281a57918d426c98b72d4a62b05ef60b70ac143157c88f761b4e363394361c0: Status 404 returned error can't find the container with id e281a57918d426c98b72d4a62b05ef60b70ac143157c88f761b4e363394361c0 Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.769852 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.770171 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.770180 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.770195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.770206 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.814790 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e281a57918d426c98b72d4a62b05ef60b70ac143157c88f761b4e363394361c0"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.815761 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"b1dafe885600abf2cc3476f5030f8bb33eb96e58ebdb7094877d2021b65511a9"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.817990 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"0078a7c59e9f9a83ff1c53e6ce9e927a216cc9246265f5063481c0e6e9c2a337"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.873061 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.873103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.873113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.873130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.873141 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.977502 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.977555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.977565 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.977583 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:05 crc kubenswrapper[4958]: I0320 09:01:05.977609 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:05Z","lastTransitionTime":"2026-03-20T09:01:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.080980 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.081026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.081038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.081054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.081065 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.104526 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.104659 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.104697 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.104727 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.104751 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.104815 4958 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:07.104778898 +0000 UTC m=+87.426794856 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.104906 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.104928 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.104942 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105004 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:07.104980504 +0000 UTC m=+87.426996642 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105012 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105028 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105044 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105099 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-20 09:01:07.105087357 +0000 UTC m=+87.427103315 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105168 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105195 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:07.10518813 +0000 UTC m=+87.427204088 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105230 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: E0320 09:01:06.105252 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:07.105247121 +0000 UTC m=+87.427263079 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.183557 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.183624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.183639 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.183658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.183674 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.291912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.291993 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.292011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.292041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.292060 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.394743 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.394811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.394826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.394854 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.394871 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.439183 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.439747 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.441279 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.441941 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.443082 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.443575 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.444171 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.445185 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.445921 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.447047 4958 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.447563 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.448888 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.449419 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.449983 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.450914 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.451417 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.452535 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.452970 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.453528 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.454714 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.455333 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.457409 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.457950 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.459072 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.459614 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.460361 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.461544 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.462190 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.463114 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.463687 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.464627 4958 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.464741 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.466748 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.467394 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.467872 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.470359 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.471144 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.471774 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.472472 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.474390 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.474961 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.475572 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.476295 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.477018 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.477478 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.479333 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.479992 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.481181 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.481653 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.482527 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.483019 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.484056 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.484628 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.485077 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.496924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.496956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.496965 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.496978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.496990 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.600345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.600389 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.600403 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.600422 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.600436 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.703131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.703174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.703183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.703199 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.703212 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.806990 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.807040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.807055 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.807073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.807084 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.824035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.824109 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.826804 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.845368 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.861274 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.876154 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.890243 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.904304 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.909484 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.909532 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.909550 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.909571 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.909585 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:06Z","lastTransitionTime":"2026-03-20T09:01:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.919758 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.938197 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.954006 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.972231 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMount
s\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:06 crc kubenswrapper[4958]: I0320 09:01:06.987498 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.001548 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:06Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.012942 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.012996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.013009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.013029 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.013042 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.017890 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:07Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.069517 4958 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.113191 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.113262 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.113289 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.113312 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.113333 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113390 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113412 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113433 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113445 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113471 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:09.113456162 +0000 UTC m=+89.435472120 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113538 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:09.113531594 +0000 UTC m=+89.435547552 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113539 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113554 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113636 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113652 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113678 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:09.113655087 +0000 UTC m=+89.435671045 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113696 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:09.113689558 +0000 UTC m=+89.435705506 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.113792 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:09.113773801 +0000 UTC m=+89.435789929 (durationBeforeRetry 2s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.115647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.115680 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.115689 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.115704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.115715 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.218346 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.218392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.218401 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.218419 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.218429 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.326713 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.326777 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.326789 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.326804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.326814 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.429778 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.429831 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.429841 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.429859 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.429870 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.434153 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.434173 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.434313 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.434441 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.434168 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:07 crc kubenswrapper[4958]: E0320 09:01:07.434573 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.532473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.533216 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.533230 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.533254 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.533268 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.638451 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.638499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.638509 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.638526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.638541 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.742017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.742073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.742087 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.742109 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.742124 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.844650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.844719 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.844729 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.844747 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.844759 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.947538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.947615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.947630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.947651 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:07 crc kubenswrapper[4958]: I0320 09:01:07.947666 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:07Z","lastTransitionTime":"2026-03-20T09:01:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.050408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.050467 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.050489 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.050504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.050516 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.153246 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.153309 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.153319 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.153334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.153344 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.256069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.256142 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.256152 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.256166 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.256177 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.358975 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.359038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.359049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.359069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.359080 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.401620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.401678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.401691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.401711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.401727 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.419830 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.434174 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.434261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.434274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.434288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.434297 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.446411 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.452224 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.452286 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.452301 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.452326 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.452341 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.472975 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.494570 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.494631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.494641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.494657 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.494668 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.527021 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.534340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.534374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.534383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.534397 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.534407 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.554611 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: E0320 09:01:08.554772 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.556685 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.556716 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.556730 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.556761 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.556773 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.659931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.659978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.659987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.660007 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.660018 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.762341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.762402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.762414 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.762436 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.762452 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.834968 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.851458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.864215 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.865305 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.865341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.865353 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.865372 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.865387 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.877765 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.890715 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.904822 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.917075 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:08Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.969045 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.969110 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.969123 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.969145 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:08 crc kubenswrapper[4958]: I0320 09:01:08.969161 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:08Z","lastTransitionTime":"2026-03-20T09:01:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.073775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.073817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.073828 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.073844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.073858 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.136587 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.136703 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.136740 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.136788 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:13.136759902 +0000 UTC m=+93.458775870 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.136821 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.136861 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.136955 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.136998 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:13.136985248 +0000 UTC m=+93.459001206 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137011 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137039 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.136859 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137040 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137127 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137151 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137159 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:13.137109162 +0000 UTC m=+93.459125130 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137210 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:13.137191724 +0000 UTC m=+93.459207702 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137061 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.137266 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:13.137253356 +0000 UTC m=+93.459269334 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.176590 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.176676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.176688 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.176703 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.176714 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.280114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.280190 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.280202 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.280222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.280556 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.383874 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.383953 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.383970 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.384011 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.384028 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.434305 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.434443 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.434535 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.434555 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.434768 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:09 crc kubenswrapper[4958]: E0320 09:01:09.434885 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.487871 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.487925 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.487936 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.487957 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.487971 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.591263 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.591325 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.591346 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.591364 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.591377 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.694175 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.694233 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.694245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.694266 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.694280 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.797478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.797549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.797565 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.797586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.797615 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.900486 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.900554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.900571 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.900633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:09 crc kubenswrapper[4958]: I0320 09:01:09.900653 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:09Z","lastTransitionTime":"2026-03-20T09:01:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.003734 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.004008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.004026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.004078 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.004089 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.107236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.107288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.107303 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.107323 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.107337 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.210803 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.210891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.210914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.210944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.210969 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.314032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.314081 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.314091 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.314108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.314119 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.417697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.417755 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.417769 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.417787 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.417801 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.450087 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.467152 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.478709 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.493885 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.511356 4958 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.520459 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.520511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.520521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.520536 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.520547 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.529317 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.623568 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 
09:01:10.623618 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.623631 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.623646 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.623659 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.726645 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.726710 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.726724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.726747 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.726765 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.831587 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.831660 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.831668 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.831689 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.831698 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.935083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.935136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.935146 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.935165 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:10 crc kubenswrapper[4958]: I0320 09:01:10.935633 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:10Z","lastTransitionTime":"2026-03-20T09:01:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.038421 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.038493 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.038504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.038520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.038529 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.142274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.142355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.142374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.142402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.142419 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.244742 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.244804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.244815 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.244834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.244846 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.349908 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.349950 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.349960 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.349976 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.349986 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.434392 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.434552 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:11 crc kubenswrapper[4958]: E0320 09:01:11.434660 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.434436 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:11 crc kubenswrapper[4958]: E0320 09:01:11.434785 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:11 crc kubenswrapper[4958]: E0320 09:01:11.434988 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.452753 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.452807 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.452820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.452839 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.452855 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.555393 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.555456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.555472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.555492 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.555505 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.658724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.658785 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.658799 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.658824 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.658838 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.760893 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.760947 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.760961 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.760982 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.760995 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.863977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.864040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.864052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.864070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.864085 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.967245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.967306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.967326 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.967349 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:11 crc kubenswrapper[4958]: I0320 09:01:11.967363 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:11Z","lastTransitionTime":"2026-03-20T09:01:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.070267 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.070317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.070327 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.070348 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.070359 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.173954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.174025 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.174035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.174054 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.174065 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.277276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.277341 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.277355 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.277381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.277397 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.379505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.379575 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.379588 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.379636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.379662 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.482347 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.482390 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.482403 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.482424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.482439 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.585351 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.585396 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.585407 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.585424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.585435 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.688452 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.688506 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.688515 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.688532 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.688545 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.790523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.790565 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.790577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.790619 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.790631 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.893522 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.893610 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.893635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.893657 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.893671 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.996862 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.996899 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.996909 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.996924 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:12 crc kubenswrapper[4958]: I0320 09:01:12.996937 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:12Z","lastTransitionTime":"2026-03-20T09:01:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.099192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.099241 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.099258 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.099277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.099290 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.172293 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.172393 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.172423 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.172449 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.172479 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172543 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:21.172508617 +0000 UTC m=+101.494524585 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172564 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172607 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172659 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:21.172640622 +0000 UTC m=+101.494656570 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172674 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:21.172668713 +0000 UTC m=+101.494684671 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172730 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172749 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172755 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172769 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172777 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172783 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172826 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:21.172814957 +0000 UTC m=+101.494830915 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.172845 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:21.172837857 +0000 UTC m=+101.494853815 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.202220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.202266 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.202277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.202297 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.202309 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.304418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.304470 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.304481 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.304502 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.304516 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.407088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.407143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.407158 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.407178 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.407190 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.434695 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.434779 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.434696 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.434881 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.434977 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:13 crc kubenswrapper[4958]: E0320 09:01:13.435085 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.510685 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.510745 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.510766 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.510787 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.510800 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.613705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.613768 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.613859 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.613881 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.613895 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.716937 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.717008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.717027 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.717062 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.717084 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.820635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.820704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.820724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.820751 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.820769 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.924031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.924083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.924109 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.924128 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:13 crc kubenswrapper[4958]: I0320 09:01:13.924142 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:13Z","lastTransitionTime":"2026-03-20T09:01:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.026620 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.026668 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.026679 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.026697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.026708 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.129041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.129113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.129125 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.129146 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.129159 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.235978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.236018 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.236028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.236043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.236053 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.339177 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.339236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.339254 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.339278 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.339292 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.442733 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.442805 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.442820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.442844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.442862 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.545854 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.545901 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.545911 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.545928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.545940 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.648431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.648472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.648480 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.648495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.648506 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.752383 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.752431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.752450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.752470 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.752483 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.855415 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.855463 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.855473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.855491 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.855503 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.958331 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.958376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.958387 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.958405 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:14 crc kubenswrapper[4958]: I0320 09:01:14.958420 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:14Z","lastTransitionTime":"2026-03-20T09:01:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.061539 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.061590 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.061623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.061643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.061653 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.164649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.164696 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.164706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.164724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.164736 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.267154 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.267207 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.267219 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.267247 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.267260 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.369987 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.370041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.370052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.370070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.370084 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.434201 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.434270 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.434346 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:15 crc kubenswrapper[4958]: E0320 09:01:15.434514 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:15 crc kubenswrapper[4958]: E0320 09:01:15.434669 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:15 crc kubenswrapper[4958]: E0320 09:01:15.434940 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.449171 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.449183 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:01:15 crc kubenswrapper[4958]: E0320 09:01:15.449471 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.450227 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.472942 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.473000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.473010 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.473028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.473039 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.575386 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.575431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.575445 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.575464 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.575478 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.677642 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.677707 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.677719 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.677746 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.677759 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.779543 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.779615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.779624 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.779640 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.779650 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.855629 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:01:15 crc kubenswrapper[4958]: E0320 09:01:15.855824 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.882841 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.882897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.882909 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.882929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.882942 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.986108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.986172 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.986185 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.986206 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:15 crc kubenswrapper[4958]: I0320 09:01:15.986220 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:15Z","lastTransitionTime":"2026-03-20T09:01:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.088826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.088897 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.088916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.088940 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.088961 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.192832 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.192926 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.192951 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.192985 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.193010 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.296000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.296057 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.296070 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.296088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.296102 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.398558 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.398683 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.398721 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.398757 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.398782 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.501468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.501560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.501578 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.501636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.501659 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.605158 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.605233 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.605253 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.605285 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.605304 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.708862 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.708918 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.708935 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.708958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.708975 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.811714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.811775 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.811790 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.811811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.811825 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.914254 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.914306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.914321 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.914340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:16 crc kubenswrapper[4958]: I0320 09:01:16.914351 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:16Z","lastTransitionTime":"2026-03-20T09:01:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.017428 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.017476 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.017487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.017501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.017511 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.120576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.120641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.120653 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.120671 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.120683 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.223215 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.223256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.223267 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.223283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.223294 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.326356 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.326418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.326433 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.326452 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.326468 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.429332 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.429388 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.429402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.429424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.429442 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.434552 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:17 crc kubenswrapper[4958]: E0320 09:01:17.434714 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.434739 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.434786 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:17 crc kubenswrapper[4958]: E0320 09:01:17.434830 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:17 crc kubenswrapper[4958]: E0320 09:01:17.434910 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.532020 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.532084 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.532097 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.532116 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.532129 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.635418 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.635473 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.635483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.635503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.635523 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.738537 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.738586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.738616 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.738633 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.738644 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.841821 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.841905 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.841915 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.841931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.841942 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.944817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.944879 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.944892 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.944916 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:17 crc kubenswrapper[4958]: I0320 09:01:17.944929 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:17Z","lastTransitionTime":"2026-03-20T09:01:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.047890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.047963 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.047979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.048000 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.048014 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.154171 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.154248 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.154264 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.154651 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.154671 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.257504 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.257567 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.257585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.257648 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.257671 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.361105 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.361151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.361163 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.361183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.361194 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.464043 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.464112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.464135 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.464166 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.464208 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.568374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.568458 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.568482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.568514 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.568537 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.583649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.583715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.583732 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.583749 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.583795 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.600994 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:18Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.607134 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.607183 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.607195 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.607217 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.607230 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.625232 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:18Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.629625 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.629673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.629686 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.629706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.629722 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.642415 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:18Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.646931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.646969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.646981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.647002 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.647305 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.658905 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:18Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.663315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.663350 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.663361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.663381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.663395 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.676787 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:18Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:18 crc kubenswrapper[4958]: E0320 09:01:18.676908 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.678471 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.678501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.678512 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.678528 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.678543 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.781291 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.781361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.781381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.781408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.781428 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.884312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.884376 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.884386 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.884409 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.884421 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.987912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.988007 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.988034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.988068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:18 crc kubenswrapper[4958]: I0320 09:01:18.988092 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:18Z","lastTransitionTime":"2026-03-20T09:01:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.091526 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.091563 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.091574 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.091589 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.091614 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.195494 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.195553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.195564 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.195582 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.196036 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.299252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.299319 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.299328 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.299363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.299376 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.402484 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.402554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.402569 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.402623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.402639 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.434221 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.434285 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:19 crc kubenswrapper[4958]: E0320 09:01:19.434391 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.434427 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:19 crc kubenswrapper[4958]: E0320 09:01:19.434763 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:19 crc kubenswrapper[4958]: E0320 09:01:19.434989 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.505221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.505282 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.505296 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.505314 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.505342 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.608678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.608740 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.608754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.608773 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.608786 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.711819 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.711871 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.711880 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.711895 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.711904 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.814398 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.814465 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.814479 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.814523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.814544 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.918017 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.918097 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.918118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.918143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:19 crc kubenswrapper[4958]: I0320 09:01:19.918165 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:19Z","lastTransitionTime":"2026-03-20T09:01:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.021382 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.021450 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.021475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.021503 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.021521 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.124032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.124074 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.124086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.124103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.124115 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.228022 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.228114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.228140 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.228179 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.228204 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.331735 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.331796 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.331808 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.331830 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.331850 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.433943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.434425 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.434668 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.434852 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.434996 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.453716 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.473222 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.496265 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.530654 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.538047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.538095 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.538108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.538130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.538144 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.561791 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.580003 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.599049 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.618213 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:20Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.641424 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.641485 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.641499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.641518 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.641530 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.744517 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.745118 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.745317 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.745493 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.745667 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.848554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.848638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.848654 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.848713 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.848730 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.952108 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.952586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.952862 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.953082 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:20 crc kubenswrapper[4958]: I0320 09:01:20.953275 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:20Z","lastTransitionTime":"2026-03-20T09:01:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.060193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.060584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.060738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.060831 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.060911 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"}
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.164401 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.164475 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.164488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.164507 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.164520 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.260363 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.260454 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.260489 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.260514 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.260533 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260749 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260758 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260797 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260810 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260816 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260828 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260803 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260895 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:37.260856936 +0000 UTC m=+117.582872894 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260915 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:37.260907048 +0000 UTC m=+117.582923006 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260932 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:37.260922838 +0000 UTC m=+117.582938796 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260951 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:01:37.260940239 +0000 UTC m=+117.582956197 (durationBeforeRetry 16s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.260977 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.261150 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:37.261113164 +0000 UTC m=+117.583129162 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.267456 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.267511 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.267525 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.267549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.267563 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.370521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.370673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.370715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.370758 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.370786 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.434850 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.434899 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.435066 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.434899 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.435293 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 09:01:21 crc kubenswrapper[4958]: E0320 09:01:21.435400 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.473945 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.473995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.474005 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.474020 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.474031 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.577635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.577693 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.577710 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.577738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.577758 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.681201 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.681260 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.681271 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.681289 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.681301 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.784540 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.784584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.784607 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.784647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.784662 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.886843 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.886889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.886901 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.886917 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.886929 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.989736 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.989789 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.989799 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.989817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:21 crc kubenswrapper[4958]: I0320 09:01:21.989829 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:21Z","lastTransitionTime":"2026-03-20T09:01:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.093152 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.093215 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.093232 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.093262 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.093281 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.200814 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.200880 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.200890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.200907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.200918 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.303912 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.304329 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.304401 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.304478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.304544 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.407517 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.407580 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.407615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.407642 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.407655 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.509810 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.509855 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.509866 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.509882 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.509893 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.617648 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.618349 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.618433 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.618505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.618569 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.722041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.722092 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.722130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.722156 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.722171 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.824943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.825243 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.825357 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.825432 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.825492 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.927577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.927699 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.927731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.927760 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:22 crc kubenswrapper[4958]: I0320 09:01:22.927779 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:22Z","lastTransitionTime":"2026-03-20T09:01:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.031067 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.031123 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.031136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.031155 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.031168 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.133844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.133896 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.133907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.133926 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.133939 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.236888 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.236952 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.236969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.236995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.237015 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.340410 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.340510 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.340528 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.340553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.340571 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.434410 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.434490 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:23 crc kubenswrapper[4958]: E0320 09:01:23.434608 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.434658 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:23 crc kubenswrapper[4958]: E0320 09:01:23.434729 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:23 crc kubenswrapper[4958]: E0320 09:01:23.434856 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.443334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.443404 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.443416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.443436 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.443449 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.546196 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.546258 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.546270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.546288 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.546299 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.649496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.649544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.649555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.649573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.649585 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.752907 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.752978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.752992 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.753015 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.753029 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.855752 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.855802 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.855812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.855830 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.855841 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.958260 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.958527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.958579 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.958649 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:23 crc kubenswrapper[4958]: I0320 09:01:23.958671 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:23Z","lastTransitionTime":"2026-03-20T09:01:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.061375 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.061421 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.061434 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.061453 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.061463 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.164514 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.164553 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.164564 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.164577 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.164590 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.267582 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.267725 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.267830 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.267857 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.267873 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.371687 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.371740 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.371752 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.371772 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.371786 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.474522 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.474658 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.474676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.474697 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.474714 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.578113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.578179 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.578192 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.578213 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.578228 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.681468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.681519 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.681532 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.681552 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.681565 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.784555 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.784643 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.784661 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.784684 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.784701 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.886984 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.887036 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.887047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.887063 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.887074 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.989989 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.990035 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.990049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.990067 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:24 crc kubenswrapper[4958]: I0320 09:01:24.990082 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:24Z","lastTransitionTime":"2026-03-20T09:01:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.093476 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.093521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.093531 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.093586 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.093624 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.195976 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.196032 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.196042 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.196057 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.196069 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.298958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.299028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.299052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.299131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.299185 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.403434 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.403505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.403523 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.403550 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.403568 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.434097 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:25 crc kubenswrapper[4958]: E0320 09:01:25.434306 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.434827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:25 crc kubenswrapper[4958]: E0320 09:01:25.435143 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.434921 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:25 crc kubenswrapper[4958]: E0320 09:01:25.435406 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.506214 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.506263 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.506275 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.506290 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.506300 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.609415 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.609463 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.609472 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.609487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.609497 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.712381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.712444 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.712460 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.712482 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.712503 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.785108 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-p2twx"] Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.785622 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.788396 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.789267 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.789470 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.803746 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.815479 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.815549 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.815565 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.815591 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.815641 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.821630 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.862994 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.884029 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.902327 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.904826 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w76jx\" (UniqueName: \"kubernetes.io/projected/fd744235-23b7-408d-958b-90a9219c6fd1-kube-api-access-w76jx\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx"
Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.904876
4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd744235-23b7-408d-958b-90a9219c6fd1-hosts-file\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.913801 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.917587 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.917635 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.917648 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.917664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.917675 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:25Z","lastTransitionTime":"2026-03-20T09:01:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.925920 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.938479 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:25 crc kubenswrapper[4958]: I0320 09:01:25.951918 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:25Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.005747 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w76jx\" (UniqueName: \"kubernetes.io/projected/fd744235-23b7-408d-958b-90a9219c6fd1-kube-api-access-w76jx\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.005825 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd744235-23b7-408d-958b-90a9219c6fd1-hosts-file\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.006031 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fd744235-23b7-408d-958b-90a9219c6fd1-hosts-file\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.020099 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.020136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.020145 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.020162 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.020171 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.026234 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w76jx\" (UniqueName: \"kubernetes.io/projected/fd744235-23b7-408d-958b-90a9219c6fd1-kube-api-access-w76jx\") pod \"node-resolver-p2twx\" (UID: \"fd744235-23b7-408d-958b-90a9219c6fd1\") " pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.104071 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-p2twx" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.124362 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.124416 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.124426 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.124443 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.124455 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.187121 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-lht4x"] Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.187649 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-kvsdf"] Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.187993 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.188072 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-wjb45"] Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.188399 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.189868 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.191108 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.191117 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.191494 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.192236 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.192894 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.193318 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.193553 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.195298 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.195576 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.195650 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.195969 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.195999 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.207883 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-hostroot\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.207926 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxv5q\" (UniqueName: \"kubernetes.io/projected/1479666a-d3f9-47dc-aa36-45cc7425d7ee-kube-api-access-zxv5q\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.207947 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-k8s-cni-cncf-io\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.207963 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-netns\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.207980 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vgtvt\" (UniqueName: \"kubernetes.io/projected/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-kube-api-access-vgtvt\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208011 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cnibin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208032 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-bin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208071 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-os-release\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208089 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-multus\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208105 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-daemon-config\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208121 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-etc-kubernetes\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208280 4958 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-system-cni-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208334 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-cni-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208370 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cni-binary-copy\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208439 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-proxy-tls\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208496 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-os-release\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208557 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-multus-certs\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208581 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-conf-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208615 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208671 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-cnibin\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 
09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208713 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-rootfs\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208765 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-socket-dir-parent\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-kubelet\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.208820 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.212959 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.226333 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.230904 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.230955 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.230968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.230992 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.231007 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.244455 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.256536 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.279408 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.295120 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.308105 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309190 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309245 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309271 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-hostroot\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309294 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxv5q\" (UniqueName: \"kubernetes.io/projected/1479666a-d3f9-47dc-aa36-45cc7425d7ee-kube-api-access-zxv5q\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309367 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-hostroot\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309535 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-k8s-cni-cncf-io\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309731 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-k8s-cni-cncf-io\") pod \"multus-lht4x\" (UID: 
\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309805 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-netns\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-netns\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309954 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vgtvt\" (UniqueName: \"kubernetes.io/projected/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-kube-api-access-vgtvt\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.309988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cnibin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310223 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-bin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310244 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310256 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cnibin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310304 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-bin\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310337 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26mbm\" (UniqueName: \"kubernetes.io/projected/31474b1f-5bf9-4201-95c2-864df0fed1d0-kube-api-access-26mbm\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: 
I0320 09:01:26.310364 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-os-release\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310564 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-mcd-auth-proxy-config\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310852 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-multus\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310954 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-cni-multus\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.310999 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-daemon-config\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311037 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-os-release\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311143 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-etc-kubernetes\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311203 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-etc-kubernetes\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311278 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-system-cni-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311321 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-cni-dir\") 
pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311342 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cni-binary-copy\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311371 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-proxy-tls\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311395 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-os-release\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311424 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-multus-certs\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311446 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-conf-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311466 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311489 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-cnibin\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311513 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-rootfs\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-socket-dir-parent\") pod \"multus-lht4x\" (UID: 
\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311569 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-kubelet\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311615 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311705 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-rootfs\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311370 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-system-cni-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311672 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-os-release\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311770 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-socket-dir-parent\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-var-lib-kubelet\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311814 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-host-run-multus-certs\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.311892 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-system-cni-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 
20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.312009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-conf-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.312220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-cni-dir\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.312544 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-cnibin\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.313207 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-multus-daemon-config\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.313212 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-binary-copy\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.316855 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-proxy-tls\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.323179 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/1479666a-d3f9-47dc-aa36-45cc7425d7ee-cni-binary-copy\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.324690 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers 
with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.325951 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxv5q\" (UniqueName: \"kubernetes.io/projected/1479666a-d3f9-47dc-aa36-45cc7425d7ee-kube-api-access-zxv5q\") pod \"multus-lht4x\" (UID: \"1479666a-d3f9-47dc-aa36-45cc7425d7ee\") " pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.325954 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vgtvt\" (UniqueName: \"kubernetes.io/projected/d3bb0dff-98a7-4359-841f-5fb469ebc3f4-kube-api-access-vgtvt\") pod \"machine-config-daemon-kvsdf\" (UID: \"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\") " pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.333112 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.333146 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.333157 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.333175 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.333187 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.339402 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.352696 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.369497 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.383786 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.399629 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.411505 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.411974 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.412017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.412043 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-26mbm\" (UniqueName: \"kubernetes.io/projected/31474b1f-5bf9-4201-95c2-864df0fed1d0-kube-api-access-26mbm\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.412918 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/31474b1f-5bf9-4201-95c2-864df0fed1d0-tuning-conf-dir\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.412956 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/31474b1f-5bf9-4201-95c2-864df0fed1d0-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.420895 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.428669 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-26mbm\" (UniqueName: \"kubernetes.io/projected/31474b1f-5bf9-4201-95c2-864df0fed1d0-kube-api-access-26mbm\") 
pod \"multus-additional-cni-plugins-wjb45\" (UID: \"31474b1f-5bf9-4201-95c2-864df0fed1d0\") " pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.431900 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.435207 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.435243 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.435252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.435267 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.435277 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.445373 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.465043 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.478015 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.490252 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.502474 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.518241 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lht4x" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.518205 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.520236 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.527080 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-wjb45" Mar 20 09:01:26 crc kubenswrapper[4958]: W0320 09:01:26.532280 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1479666a_d3f9_47dc_aa36_45cc7425d7ee.slice/crio-07b780aceb024aa7525ff9b26658780e783538a7da9e293c77913d0b35a70bfe WatchSource:0}: Error finding container 07b780aceb024aa7525ff9b26658780e783538a7da9e293c77913d0b35a70bfe: Status 404 returned error can't find the container with id 07b780aceb024aa7525ff9b26658780e783538a7da9e293c77913d0b35a70bfe Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.537711 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.537778 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.537798 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.537826 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.537844 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.563834 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmjtz"] Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.564801 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.574265 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.574900 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.574906 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.574959 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.575160 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.575330 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.575810 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.593699 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.612698 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.614185 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.614961 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.615108 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.615920 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616045 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" 
(UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616143 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616240 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616390 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-systemd\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616503 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616640 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpg76\" (UniqueName: \"kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.616924 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617013 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617119 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617200 4958 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617267 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617402 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617468 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617632 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.617724 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.631701 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.640344 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.640379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.640392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.640413 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.640426 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.647870 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.660743 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.686157 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: 
I0320 09:01:26.703399 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.717129 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718498 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718516 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718533 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718582 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718615 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-systemd\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718633 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718650 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpg76\" (UniqueName: \"kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718668 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718698 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718763 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718781 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718825 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718841 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.718858 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719437 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719516 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719576 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd\") pod 
\"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719641 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719677 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720040 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-systemd\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720074 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720098 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.719494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720137 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720225 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 
crc kubenswrapper[4958]: I0320 09:01:26.720222 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720420 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720635 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.720876 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.727761 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.730155 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.744329 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.744366 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.744377 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.744392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.744405 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.745821 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpg76\" (UniqueName: \"kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76\") pod \"ovnkube-node-tmjtz\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.748128 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.776446 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.793239 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.810357 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.853571 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.853638 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.853652 4958 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.853672 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.853689 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.887381 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerStarted","Data":"b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.887458 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerStarted","Data":"07b780aceb024aa7525ff9b26658780e783538a7da9e293c77913d0b35a70bfe"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.890166 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p2twx" event={"ID":"fd744235-23b7-408d-958b-90a9219c6fd1","Type":"ContainerStarted","Data":"e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.890244 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-p2twx" event={"ID":"fd744235-23b7-408d-958b-90a9219c6fd1","Type":"ContainerStarted","Data":"231296fc2027641228759643118f360773788cdbb7d6b791eb7fbabe5c13dc0e"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.893038 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerStarted","Data":"87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.893091 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerStarted","Data":"ce015c4cf201f65184ab4960cd31caf07ece0fd6bdfedab54c7b71ddf58479f5"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.898375 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.898412 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.898427 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" 
event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"ef7ff8473c34ae07d12b265b65b5397041ae8fec279d87fa7327e1518c3fd4cd"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.907313 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.921026 4958 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.926096 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d7732
57453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/l
og\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\
",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: W0320 09:01:26.937742 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb4de400_dc39_4926_8311_279b913e5871.slice/crio-e1d4a03bf8affed2ba168af7dff8dc9fe51eb5be068bd9fa84b35e70a3eeffd6 WatchSource:0}: Error finding container e1d4a03bf8affed2ba168af7dff8dc9fe51eb5be068bd9fa84b35e70a3eeffd6: Status 404 returned error can't find the container with id e1d4a03bf8affed2ba168af7dff8dc9fe51eb5be068bd9fa84b35e70a3eeffd6 Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.940681 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.957276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.957330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.957342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.957361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.957376 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:26Z","lastTransitionTime":"2026-03-20T09:01:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.970009 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:26 crc kubenswrapper[4958]: I0320 09:01:26.985794 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.005246 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:26Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.020779 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.034571 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.053536 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"
/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.066185 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" 
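Every "Failed to update status for pod" record in this capture shares one root cause: the kubelet's status PATCH is intercepted by the pod.network-node-identity.openshift.io validating webhook at https://127.0.0.1:9743/pod, and that webhook's serving certificate expired at 2025-08-24T17:21:41Z while the node clock reads 2026-03-20T09:01:27Z. The sketch below is a minimal, hypothetical way to confirm that validity window directly against the listener, independent of the kubelet. Only the host and port are taken from the Post URL in the log; everything else (the third-party cryptography package, the deliberately disabled verification) is an assumption of this sketch, not something the cluster ships.

    # Hypothetical diagnostic sketch: read the notBefore/notAfter of the
    # webhook serving certificate on 127.0.0.1:9743 (host/port from the
    # log's Post URL). Verification is disabled on purpose: we want the
    # leaf certificate even though it is expired.
    import socket
    import ssl
    from datetime import datetime, timezone

    from cryptography import x509  # third-party; assumed available

    HOST, PORT = "127.0.0.1", 9743

    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE  # accept the invalid cert to inspect it

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # raw DER bytes

    # The stdlib ssl module will not parse an unverified peer cert, so
    # decode the DER blob with cryptography instead. The *_utc accessors
    # need cryptography >= 42; older versions expose naive not_valid_after.
    cert = x509.load_der_x509_certificate(der)
    now = datetime.now(timezone.utc)
    print("notBefore:", cert.not_valid_before_utc)
    print("notAfter: ", cert.not_valid_after_utc)
    print("expired:  ", now > cert.not_valid_after_utc)

If the printed notAfter matches the 2025-08-24T17:21:41Z in the log, the failures below are purely clock-versus-certificate: no amount of kubelet retries will succeed until the webhook certificate is rotated or the node time is corrected.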
Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.066234 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.066252 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.066280 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.066305 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.082572 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\
"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37
bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.099439 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.124458 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.149443 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.165381 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.169495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.169528 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.169538 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.169554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.169567 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.181360 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.209754 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn
-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.225295 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.238559 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.260154 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"l
astState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f
6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.271947 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.271981 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.271991 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.272006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.272016 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.283820 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.300760 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.316106 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.329482 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.343700 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.355459 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.370272 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.373999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.374034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.374047 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.374065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.374075 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.434162 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.434243 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:27 crc kubenswrapper[4958]: E0320 09:01:27.434326 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:27 crc kubenswrapper[4958]: E0320 09:01:27.434412 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.434522 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:27 crc kubenswrapper[4958]: E0320 09:01:27.434627 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.477330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.477392 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.477410 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.477435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.477455 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.580621 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.580671 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.580686 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.580705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.580720 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.688928 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.688969 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.688979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.688995 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.689004 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.792096 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.792157 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.792168 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.792189 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.792238 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.895182 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.895235 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.895256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.895277 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.895292 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.911276 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3" exitCode=0 Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.911370 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.914850 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" exitCode=0 Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.914907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.914932 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"e1d4a03bf8affed2ba168af7dff8dc9fe51eb5be068bd9fa84b35e70a3eeffd6"} Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.930257 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.954531 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.982612 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:27 crc kubenswrapper[4958]: 
I0320 09:01:27.997843 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.997891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.997899 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.997913 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:27 crc kubenswrapper[4958]: I0320 09:01:27.997924 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:27Z","lastTransitionTime":"2026-03-20T09:01:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:27.999896 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reaso
n\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:27Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc 
kubenswrapper[4958]: I0320 09:01:28.028166 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\
\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.045015 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.065246 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.079734 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.093954 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.100931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.100964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.100974 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.100994 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.101006 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.107327 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.126218 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.140430 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.157420 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.171814 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 
2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.185241 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.204978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.205039 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.205049 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.204963 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.205068 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.205081 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.223368 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.237781 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20
T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.249338 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.260084 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.271239 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.282246 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.298246 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\
\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-ap
i-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.307837 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.307893 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.307905 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.307925 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.307938 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.316972 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.330558 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.349769 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.410623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.410678 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.410693 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.410714 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.410725 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.435228 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.513792 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.514156 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.514177 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.514205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.514225 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.617411 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.617455 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.617467 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.617484 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.617496 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.720976 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.721009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.721019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.721036 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.721049 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.824128 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.824439 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.824447 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.824507 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.824519 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.896647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.896680 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.896690 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.896705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.896715 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: E0320 09:01:28.919712 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.925331 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.926590 4958 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.926956 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.931894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.932161 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.932226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.932297 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.932362 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.936641 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.936850 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.936939 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.937032 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.937114 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.942389 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerStarted","Data":"ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377"} Mar 20 09:01:28 crc 
kubenswrapper[4958]: I0320 09:01:28.944319 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: E0320 09:01:28.948546 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeByt
es\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4
d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.954038 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.954075 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.954086 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.954103 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.954131 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.964026 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z 
is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: E0320 09:01:28.967245 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.971384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.971425 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.971434 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.971451 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.971463 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.980685 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: E0320 09:01:28.985675 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.989327 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.989356 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.989365 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.989382 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.989395 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:28Z","lastTransitionTime":"2026-03-20T09:01:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:28 crc kubenswrapper[4958]: I0320 09:01:28.991866 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: E0320 09:01:29.001179 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\
"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":45063
7738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:28Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: E0320 09:01:29.001354 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003167 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003188 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003204 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.003977 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.018402 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/hos
t/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\
\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.044889 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.063167 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.084080 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.101548 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.105204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.105251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.105263 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.105283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.105295 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.113943 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.125677 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.138095 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.151936 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.166083 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.189800 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z 
is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.202990 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.207629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.207669 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.207677 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.207695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.207706 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.217666 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64
b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.236417 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.250550 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.267180 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.278957 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.289139 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.300514 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.310780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.310844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.310859 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.310883 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.310900 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.312521 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.328166 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.413889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.413939 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.413951 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.413974 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.413987 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.433954 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.434011 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:29 crc kubenswrapper[4958]: E0320 09:01:29.434199 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.434274 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:29 crc kubenswrapper[4958]: E0320 09:01:29.434480 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:29 crc kubenswrapper[4958]: E0320 09:01:29.434630 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.516719 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.516764 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.516776 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.516794 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.516805 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.619914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.619977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.619988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.620006 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.620018 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.723219 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.723272 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.723289 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.723312 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.723329 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.825784 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.825863 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.825886 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.825919 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.825943 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.928780 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.928891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.928902 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.928920 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.928931 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:29Z","lastTransitionTime":"2026-03-20T09:01:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.948287 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377" exitCode=0 Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.948390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.954755 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"} Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.973103 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:29 crc kubenswrapper[4958]: I0320 09:01:29.994082 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:29Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.023696 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z 
is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.032968 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.033044 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.033059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.033083 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.033099 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.055974 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.076686 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.094660 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.112680 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.128807 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143325 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143367 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143384 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143399 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143413 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.143548 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.161195 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.176743 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.192263 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.206012 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.251835 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.251918 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.251932 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.251953 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.251969 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.355046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.355127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.355162 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.355194 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.355219 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.451341 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.457379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.457426 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.457435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.457453 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.457464 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.466855 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.483764 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.499663 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.521432 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.544942 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":
\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26
Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.559542 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.559614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.559630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.559650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.559663 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.562800 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-
20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.594878 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageI
D\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.612066 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.629720 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.643540 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.653880 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.663134 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.663186 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.663199 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.663220 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.663234 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.667799 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.767059 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.767102 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.767113 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.767136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.767146 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.870627 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.870693 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.870705 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.870726 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.870739 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.960711 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac" exitCode=0 Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.960776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.973349 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.973415 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.973435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.973468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.973503 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:30Z","lastTransitionTime":"2026-03-20T09:01:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:30 crc kubenswrapper[4958]: I0320 09:01:30.982051 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:30Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.011807 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-op
envswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{
},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.031847 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.049417 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.066269 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.081550 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.094306 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.110811 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.115992 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.116040 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.116056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.116080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.116095 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.133732 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/
cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.156159 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resourc
e-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.170805 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.185087 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.206986 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.218835 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.218866 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.218874 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.218891 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.218904 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.321760 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.321812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.321829 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.321850 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.321873 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.423941 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.423998 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.424008 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.424026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.424037 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.434292 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.434333 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.434299 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:31 crc kubenswrapper[4958]: E0320 09:01:31.434474 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:31 crc kubenswrapper[4958]: E0320 09:01:31.434672 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:31 crc kubenswrapper[4958]: E0320 09:01:31.434751 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.527013 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.527056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.527064 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.527080 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.527099 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.630261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.630336 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.630350 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.630382 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.630408 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.733745 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.733812 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.733825 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.733844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.733857 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.836660 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.836731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.836747 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.836773 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.836791 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.939615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.939672 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.939683 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.939702 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.939714 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:31Z","lastTransitionTime":"2026-03-20T09:01:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.970248 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.972882 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a" exitCode=0 Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.972944 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a"} Mar 20 09:01:31 crc kubenswrapper[4958]: I0320 09:01:31.988391 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mo
untPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:31Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.001816 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.023423 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z 
is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.042090 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.042130 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.042141 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.042160 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.042173 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.045686 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.071445 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.090834 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.106410 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.120673 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144320 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144561 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144686 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.144782 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.163032 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.177689 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.192158 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.210344 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.247505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.247546 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.247556 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.247573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.247584 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.350516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.350565 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.350579 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.350614 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.350633 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.455432 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.455478 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.455488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.455505 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.455516 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.512724 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-25jgh"] Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.513265 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.516006 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.516124 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.516187 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.517274 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.527064 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.542128 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.557786 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.557820 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.557834 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.557851 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.557863 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.558582 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.584345 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c687744
1ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.605823 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.623449 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.629201 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcs6t\" (UniqueName: \"kubernetes.io/projected/0947786e-ea2f-478d-b90f-c8f9d33e9999-kube-api-access-lcs6t\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.629286 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0947786e-ea2f-478d-b90f-c8f9d33e9999-serviceca\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.629329 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0947786e-ea2f-478d-b90f-c8f9d33e9999-host\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.638487 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.652148 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.660560 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.660625 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.660636 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.660656 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.660667 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.666720 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.680465 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.696335 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.713944 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.728478 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.731147 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcs6t\" (UniqueName: \"kubernetes.io/projected/0947786e-ea2f-478d-b90f-c8f9d33e9999-kube-api-access-lcs6t\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.731220 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0947786e-ea2f-478d-b90f-c8f9d33e9999-serviceca\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.731295 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0947786e-ea2f-478d-b90f-c8f9d33e9999-host\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.731413 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0947786e-ea2f-478d-b90f-c8f9d33e9999-host\") pod \"node-ca-25jgh\" (UID: 
\"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.734367 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/0947786e-ea2f-478d-b90f-c8f9d33e9999-serviceca\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.748882 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z 
is after 2025-08-24T17:21:41Z" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.753218 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcs6t\" (UniqueName: \"kubernetes.io/projected/0947786e-ea2f-478d-b90f-c8f9d33e9999-kube-api-access-lcs6t\") pod \"node-ca-25jgh\" (UID: \"0947786e-ea2f-478d-b90f-c8f9d33e9999\") " pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.763679 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.763727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.763738 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.763771 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.763791 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.834032 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-25jgh" Mar 20 09:01:32 crc kubenswrapper[4958]: W0320 09:01:32.856564 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0947786e_ea2f_478d_b90f_c8f9d33e9999.slice/crio-1435de47edc2f5b934dc4a5c0c219b98ecd4d0c06912537a8f250777f99eaee7 WatchSource:0}: Error finding container 1435de47edc2f5b934dc4a5c0c219b98ecd4d0c06912537a8f250777f99eaee7: Status 404 returned error can't find the container with id 1435de47edc2f5b934dc4a5c0c219b98ecd4d0c06912537a8f250777f99eaee7 Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.866483 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.866516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.866527 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.866544 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.866557 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.969894 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.969943 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.969956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.969979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.969992 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:32Z","lastTransitionTime":"2026-03-20T09:01:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.979693 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerStarted","Data":"ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.980904 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-25jgh" event={"ID":"0947786e-ea2f-478d-b90f-c8f9d33e9999","Type":"ContainerStarted","Data":"1435de47edc2f5b934dc4a5c0c219b98ecd4d0c06912537a8f250777f99eaee7"} Mar 20 09:01:32 crc kubenswrapper[4958]: I0320 09:01:32.993336 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:32Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.006390 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.019801 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad8
6a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/
entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.039353 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.059335 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.074417 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.081335 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.081394 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.081407 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.081426 4958 kubelet_node_status.go:724] "Recording event message for 
node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.081438 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.090784 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.102335 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.117731 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.134532 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.148881 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.164681 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36
cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.177639 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.183509 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.183588 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.183629 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.183650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.183661 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.196172 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943
255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:33Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.286607 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.286667 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.286682 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.286703 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.286718 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
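
The patch failures above share one root cause, stated at the end of the error: the node-identity webhook at https://127.0.0.1:9743 is serving a certificate that expired on 2025-08-24T17:21:41Z, so every status PATCH the kubelet sends is rejected at admission with an Internal error. A minimal sketch for confirming what that endpoint is serving, with the host and port taken from the webhook URL in the log; the third-party cryptography package (version 42+ for not_valid_after_utc) is an assumption, not something the log implies:

    import datetime
    import socket
    import ssl

    from cryptography import x509  # assumption: 'cryptography' >= 42 is available

    HOST, PORT = "127.0.0.1", 9743  # from the webhook URL in the error above

    # Verification is disabled on purpose: the goal is to fetch the expired
    # certificate and inspect it, not to trust it.
    ctx = ssl.create_default_context()
    ctx.check_hostname = False
    ctx.verify_mode = ssl.CERT_NONE

    with socket.create_connection((HOST, PORT), timeout=5) as sock:
        with ctx.wrap_socket(sock, server_hostname=HOST) as tls:
            der = tls.getpeercert(binary_form=True)  # dict form is empty under CERT_NONE

    cert = x509.load_der_x509_certificate(der)
    now = datetime.datetime.now(datetime.timezone.utc)
    print("notAfter:", cert.not_valid_after_utc)
    print("expired :", now > cert.not_valid_after_utc)
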
Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.389583 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.389731 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.389751 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.390198 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.390255 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.434833 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.434850 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:33 crc kubenswrapper[4958]: E0320 09:01:33.435032 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:33 crc kubenswrapper[4958]: E0320 09:01:33.435151 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.434854 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:33 crc kubenswrapper[4958]: E0320 09:01:33.435268 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
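
Each of these sync errors reduces to the same missing input: the kubelet finds nothing under /etc/kubernetes/cni/net.d/, so NetworkReady stays false and no new pod sandboxes can be started. A sketch of that directory check; the extension list (*.conf, *.conflist, *.json) is the usual libcni set and is an assumption here rather than anything the log states:

    import pathlib

    NET_D = pathlib.Path("/etc/kubernetes/cni/net.d")  # directory named in the log
    CNI_EXTS = {".conf", ".conflist", ".json"}         # assumption: libcni's defaults

    confs = sorted(p for p in NET_D.glob("*") if p.suffix in CNI_EXTS)
    if not confs:
        # The state the kubelet keeps reporting: NetworkReady=false, no sandboxes.
        print(f"no CNI configuration file in {NET_D}/")
    for p in confs:
        print("found:", p)
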
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.493136 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.493175 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.493185 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.493201 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.493212 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.596164 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.596223 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.596236 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.596259 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.596273 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.699114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.699205 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.699227 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.699256 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.699274 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.802143 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.802221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.802237 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.802259 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.802272 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.905870 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.905956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.905973 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.905994 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.906007 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:33Z","lastTransitionTime":"2026-03-20T09:01:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
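
The x509 error carries both the clock and the cutoff, so the size of the problem is plain arithmetic: by the time these entries are written, the webhook certificate has been expired for a bit over 207 days.

    from datetime import datetime, timezone

    # Both timestamps copied from the x509 error text above.
    now       = datetime(2026, 3, 20, 9, 1, 33, tzinfo=timezone.utc)
    not_after = datetime(2025, 8, 24, 17, 21, 41, tzinfo=timezone.utc)

    print(now - not_after)  # 207 days, 15:39:52 of expiry at this point in the log
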
Has your network provider started?"} Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.990471 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5" exitCode=0 Mar 20 09:01:33 crc kubenswrapper[4958]: I0320 09:01:33.990533 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.004076 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.005022 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.005077 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.005099 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.007859 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.007839 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.007959 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-25jgh" event={"ID":"0947786e-ea2f-478d-b90f-c8f9d33e9999","Type":"ContainerStarted","Data":"9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.007901 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.008048 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.008069 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.008081 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.025109 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.042119 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.044144 4958 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.049886 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.061728 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"s
tarted\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.087699 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.104468 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.111168 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.111214 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.111226 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.111244 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.111259 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.122207 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.135368 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.149142 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.167055 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.183542 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.199451 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215401 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215642 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215724 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215734 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215753 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.215763 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.237954 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943
255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.254266 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.268007 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.279684 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.292768 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.307806 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129
f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\
\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.319801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.319857 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.319872 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: 
I0320 09:01:34.319896 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.319913 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.330837 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9
0a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e60
74f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.345686 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.360204 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.374721 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.388074 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.403236 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.420989 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/
webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.423306 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.423342 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.423352 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.423371 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.423384 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.440486 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.470180 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf
926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:34Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.526290 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.526343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.526354 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.526374 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.526388 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.629865 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.629927 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.629938 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.629958 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.629970 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.733336 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.733386 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.733396 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.733417 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.733429 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.836073 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.836121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.836131 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.836150 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.836160 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.938945 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.938997 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.939009 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.939028 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:34 crc kubenswrapper[4958]: I0320 09:01:34.939042 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:34Z","lastTransitionTime":"2026-03-20T09:01:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.015951 4958 generic.go:334] "Generic (PLEG): container finished" podID="31474b1f-5bf9-4201-95c2-864df0fed1d0" containerID="41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c" exitCode=0 Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.016040 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerDied","Data":"41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.032015 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.042378 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.042431 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.042441 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.042462 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.042472 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.046201 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.063537 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.076032 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.098590 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.113962 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.136021 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf
926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.146184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.146229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.146239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.146255 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.146264 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.162913 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.185884 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 
09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\
\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.202003 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.217455 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.232924 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249267 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249276 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249294 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249305 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.249274 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to 
call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.282131 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finish
edAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-20T09:01:35Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.352361 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.352408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.352423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.352440 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.352450 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.434460 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.434556 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:35 crc kubenswrapper[4958]: E0320 09:01:35.434645 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:35 crc kubenswrapper[4958]: E0320 09:01:35.434782 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.434878 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:35 crc kubenswrapper[4958]: E0320 09:01:35.434939 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.454632 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.454679 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.454691 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.454727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.454765 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.556920 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.556954 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.556964 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.556977 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.556987 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.660723 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.660978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.660990 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.661014 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.661028 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.764201 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.764245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.764254 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.764268 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.764279 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.867673 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.868151 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.868167 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.868184 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.868196 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.971438 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.971488 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.971499 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.971516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:35 crc kubenswrapper[4958]: I0320 09:01:35.971530 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:35Z","lastTransitionTime":"2026-03-20T09:01:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.027183 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" event={"ID":"31474b1f-5bf9-4201-95c2-864df0fed1d0","Type":"ContainerStarted","Data":"c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.042327 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.054681 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.069956 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.074243 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.074292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.074334 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.074352 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.074363 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.082589 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.100441 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.116174 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.144869 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf
926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccoun
t\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.172247 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.178762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.178804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.178817 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.178835 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.178844 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.193320 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartC
ount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.210459 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.224219 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.249306 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.264530 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286828 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286889 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286902 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286921 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286932 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.286906 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:36Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.395784 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.395844 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.395855 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.395876 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.395889 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.498727 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.498788 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.498801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.498822 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.498835 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.602283 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.602330 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.602340 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.602359 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.602370 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.705437 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.705490 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.705501 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.705518 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.705533 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.808757 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.808804 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.808813 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.808830 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.808842 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.911573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.911637 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.911650 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.911666 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:36 crc kubenswrapper[4958]: I0320 09:01:36.911677 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:36Z","lastTransitionTime":"2026-03-20T09:01:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.014157 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.014521 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.014533 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.014554 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.014566 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.117180 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.117239 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.117250 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.117269 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.117280 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.219270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.219315 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.219327 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.219343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.219356 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.304487 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.304703 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.304771 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.304820 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.304890 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:09.304830382 +0000 UTC m=+149.626846380 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.304925 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.304956 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305011 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305032 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:09.304997126 +0000 UTC m=+149.627013134 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305047 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.304963 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.305006 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305127 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:09.305100869 +0000 UTC m=+149.627117007 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305173 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:09.305152551 +0000 UTC m=+149.627168749 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305208 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305248 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305280 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.305371 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:09.305347586 +0000 UTC m=+149.627363714 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.322245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.322322 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.322345 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.322377 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.322395 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.424625 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.424701 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.424750 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.424793 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.424818 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.433785 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.433861 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.433983 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c"
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.434131 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8"
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.434288 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5"
Mar 20 09:01:37 crc kubenswrapper[4958]: E0320 09:01:37.434434 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.528144 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.528193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.528204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.528221 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.528232 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.631124 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.631173 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.631186 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.631204 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.631215 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.734191 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.734251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.734265 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.734290 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.734307 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.837615 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.837681 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.837695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.837715 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.837730 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.941034 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.941097 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.941121 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.941147 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:37 crc kubenswrapper[4958]: I0320 09:01:37.941164 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:37Z","lastTransitionTime":"2026-03-20T09:01:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.045701 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory"
Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.045755 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure"
Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.045767 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID"
Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.045788 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady"
Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.045803 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/.
Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.049330 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/0.log" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.052027 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1" exitCode=1 Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.052074 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.052968 4958 scope.go:117] "RemoveContainer" containerID="21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.068833 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.083835 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.113716 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.129218 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.143995 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.149853 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.149890 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.149900 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.149917 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.149929 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.160337 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.186387 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf
926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.210889 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.226831 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.242461 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.252452 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.252496 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.252510 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.252530 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.252542 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.258366 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.271337 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.287654 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.304998 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.355543 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.355627 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.355644 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.355664 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.355674 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.458730 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.458772 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.458781 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.458800 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.458812 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.523733 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll"] Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.524403 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.527101 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.527156 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.538457 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.552301 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.561058 4958 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.561100 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.561111 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.561127 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.561140 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.570977 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2
c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/
secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.584080 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.607531 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.621797 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8af587c-3589-43ca-800d-f908c8e18cbb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.621855 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.621919 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhbjv\" (UniqueName: \"kubernetes.io/projected/e8af587c-3589-43ca-800d-f908c8e18cbb-kube-api-access-bhbjv\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.621967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.622661 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"m
ountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.640473 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.647772 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.656123 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.663222 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.663263 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.663274 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.663292 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.663305 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.675278 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.689722 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.702427 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.717216 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.722877 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhbjv\" (UniqueName: \"kubernetes.io/projected/e8af587c-3589-43ca-800d-f908c8e18cbb-kube-api-access-bhbjv\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.722928 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.722957 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8af587c-3589-43ca-800d-f908c8e18cbb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.722980 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.723704 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc 
kubenswrapper[4958]: I0320 09:01:38.723873 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e8af587c-3589-43ca-800d-f908c8e18cbb-env-overrides\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.731495 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e8af587c-3589-43ca-800d-f908c8e18cbb-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.733674 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"
mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.752163 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhbjv\" (UniqueName: \"kubernetes.io/projected/e8af587c-3589-43ca-800d-f908c8e18cbb-kube-api-access-bhbjv\") pod \"ovnkube-control-plane-749d76644c-42cll\" (UID: \"e8af587c-3589-43ca-800d-f908c8e18cbb\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.752454 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.766208 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.766261 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.766273 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.766297 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.766315 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.776260 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.802229 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.824014 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.881529 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.881504 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf
926df3436f6c452827186ac1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef
0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.884573 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.884619 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.884630 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.884647 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.884657 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.904276 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.925452 4958 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.962272 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-di
r\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\"
:\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.981258 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.986847 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.986922 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.986936 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.986978 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.986991 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:38Z","lastTransitionTime":"2026-03-20T09:01:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:38 crc kubenswrapper[4958]: I0320 09:01:38.996651 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:38Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.006395 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.017735 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.029808 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.042347 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.057965 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/0.log" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.058754 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.062116 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.062893 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.062989 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" event={"ID":"e8af587c-3589-43ca-800d-f908c8e18cbb","Type":"ContainerStarted","Data":"ce9642497985acc470f16cd0e64990cf070bd28f0051a864334bba6b9cfc7d8e"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.072303 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089413 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089457 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089470 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089487 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089499 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.089382 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"h
ostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.105293 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volu
meMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"19
2.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.117426 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\
\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.137995 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa37232
69019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\
\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.156273 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.172924 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.190379 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.193728 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.193762 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.193772 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.193790 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.193803 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.203606 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.214935 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.228143 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.244113 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.259301 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-trr7n"] Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.260540 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.260713 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.267109 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}
],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.277620 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.291154 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.297495 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.297558 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.297570 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.297591 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.297624 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.314346 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.339316 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6
308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.356346 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.376537 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.379988 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.380046 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.380060 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.380088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.380106 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.392790 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.396465 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient 
memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\
\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\
":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.402811 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.402857 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.402866 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.402888 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.402899 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.415898 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.417000 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae
669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.420287 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.420423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.420502 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.420567 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.420665 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.432431 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdpxg\" (UniqueName: \"kubernetes.io/projected/14288bf2-b6fe-4961-ad00-a39f76ff1a78-kube-api-access-xdpxg\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.432535 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.433331 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [same status patch payload as logged at 09:01:39.417000 above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.433736 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.433813 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.433868 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.433736 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.433963 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.434116 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.434658 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.437210 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.437238 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.437251 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.437270 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.437282 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"}
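The patch bodies in these status_manager and kubelet_node_status entries are ordinary JSON strategic-merge patches; they look unreadable only because the kubelet quotes the patch (escaping its quotes) when it renders the error string, and the journal capture shown here adds a further level of backslashes. A sketch of how one level of that escaping can be undone, using a shortened stand-in payload rather than the multi-kilobyte patches above:

```go
package main

import (
	"encoding/json"
	"fmt"
	"strconv"
)

func main() {
	// A quoted, escaped patch string as it appears inside an err="..."
	// field. strconv.Unquote removes one level of quoting/escaping;
	// json.MarshalIndent then renders the patch readably. This payload
	// is a hypothetical, shortened stand-in for the patches in the log.
	escaped := `"{\"status\":{\"conditions\":[{\"type\":\"Ready\",\"status\":\"False\",\"reason\":\"KubeletNotReady\"}]}}"`

	raw, err := strconv.Unquote(escaped)
	if err != nil {
		panic(err)
	}
	var patch map[string]any
	if err := json.Unmarshal([]byte(raw), &patch); err != nil {
		panic(err)
	}
	pretty, _ := json.MarshalIndent(patch, "", "  ")
	fmt.Println(string(pretty))
}
```

Depending on how a given log was captured, strconv.Unquote may need to be applied more than once before json.Unmarshal succeeds.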
Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.447273 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [same status patch payload as logged at 09:01:39.417000 above] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z"
Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.449064 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.451026 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.451062 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.451072 4958 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.451088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.451100 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.463313 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.464045 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.464173 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.465641 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.465676 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.465688 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.465706 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.465720 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.482212 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6
308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.496111 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.510071 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.524090 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.533854 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.533917 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdpxg\" (UniqueName: \"kubernetes.io/projected/14288bf2-b6fe-4961-ad00-a39f76ff1a78-kube-api-access-xdpxg\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.534023 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:39 crc kubenswrapper[4958]: E0320 09:01:39.534108 4958 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:40.034085503 +0000 UTC m=+120.356101461 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.538611 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"rea
dOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.560555 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdpxg\" (UniqueName: \"kubernetes.io/projected/14288bf2-b6fe-4961-ad00-a39f76ff1a78-kube-api-access-xdpxg\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.561130 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e4
0792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"
},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.570359 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.570407 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.570417 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.570435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.570446 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.576729 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.601997 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.615417 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:39Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.672996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.673077 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.673088 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.673106 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.673118 4958 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.776516 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.776567 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.776584 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.776626 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.776639 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.879587 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.879663 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.879675 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.879695 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.879709 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.982474 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.982818 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.982956 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.983030 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:39 crc kubenswrapper[4958]: I0320 09:01:39.983087 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:39Z","lastTransitionTime":"2026-03-20T09:01:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.038178 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.038576 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.038760 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:41.038737829 +0000 UTC m=+121.360753787 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.068803 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/1.log" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.070260 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/0.log" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.073010 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0" exitCode=1 Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.073103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.073230 4958 scope.go:117] "RemoveContainer" containerID="21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.073925 4958 scope.go:117] "RemoveContainer" containerID="879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0" Mar 20 09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.074121 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.075168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" event={"ID":"e8af587c-3589-43ca-800d-f908c8e18cbb","Type":"ContainerStarted","Data":"a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.075369 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" event={"ID":"e8af587c-3589-43ca-800d-f908c8e18cbb","Type":"ContainerStarted","Data":"b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.084979 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.085019 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.085031 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.085052 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 
09:01:40.085066 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:40Z","lastTransitionTime":"2026-03-20T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.088614 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.106240 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.121028 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.140117 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.154249 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.168647 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.182730 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.187704 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.187754 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.187787 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.187809 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.187820 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:40Z","lastTransitionTime":"2026-03-20T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.206791 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\
":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 
3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.219999 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.237213 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"1
92.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"20
26-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd
0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.251500 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.274418 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e6
9e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.289644 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0
f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get 
\\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.290862 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.291051 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.291065 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.291085 4958 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.291100 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:40Z","lastTransitionTime":"2026-03-20T09:01:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.309663 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.323293 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.334969 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.349367 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.369293 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6
308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 
09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.385628 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.392935 4958 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.411677 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://
198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.425329 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.433854 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.434020 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.438326 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.449074 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.459420 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.470740 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.484236 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.497222 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.509554 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.523538 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.536814 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.550394 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: E0320 09:01:40.556415 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.564993 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.576723 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.596007 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6
308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c5d0cd7577821e14c2fc9c3e2c851788481fbf926df3436f6c452827186ac1\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:37Z\\\",\\\"message\\\":\\\" reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.101802 6792 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:37.102199 6792 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.102258 6792 reflector.go:311] Stopping reflector *v1.Pod (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:37.103065 6792 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 09:01:37.103108 6792 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 09:01:37.103123 6792 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 09:01:37.103129 6792 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 09:01:37.103144 6792 factory.go:656] Stopping watch factory\\\\nI0320 09:01:37.103186 6792 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 09:01:37.103201 6792 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 09:01:37.103226 6792 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:37.103265 6792 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 
09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri
-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.610338 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.632435 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.652387 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.677270 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.707162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.745266 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.788329 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.833675 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.870749 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.910781 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.953012 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:40 crc kubenswrapper[4958]: I0320 09:01:40.989251 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:40Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.027731 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.047887 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.048282 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.048529 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:43.048504915 +0000 UTC m=+123.370520873 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.064917 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.084924 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/1.log" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.094784 4958 scope.go:117] "RemoveContainer" containerID="879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0" Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.094985 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.124204 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.167349 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.189049 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.223365 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.268677 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.310083 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.353046 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.391135 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.430986 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.433805 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.433964 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.433935 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.434115 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.434311 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:41 crc kubenswrapper[4958]: E0320 09:01:41.434347 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.467716 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.510632 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.546015 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.585042 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.626172 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.673157 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:41 crc kubenswrapper[4958]: I0320 09:01:41.705967 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:41Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:42 crc kubenswrapper[4958]: I0320 09:01:42.434774 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:42 crc kubenswrapper[4958]: E0320 09:01:42.435038 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:43 crc kubenswrapper[4958]: I0320 09:01:43.065483 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:43 crc kubenswrapper[4958]: E0320 09:01:43.065767 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:43 crc kubenswrapper[4958]: E0320 09:01:43.065875 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:47.065850672 +0000 UTC m=+127.387866630 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:43 crc kubenswrapper[4958]: I0320 09:01:43.434449 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:43 crc kubenswrapper[4958]: I0320 09:01:43.434545 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:43 crc kubenswrapper[4958]: E0320 09:01:43.434634 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:43 crc kubenswrapper[4958]: E0320 09:01:43.434808 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:43 crc kubenswrapper[4958]: I0320 09:01:43.434921 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:43 crc kubenswrapper[4958]: E0320 09:01:43.435183 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:44 crc kubenswrapper[4958]: I0320 09:01:44.434223 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:44 crc kubenswrapper[4958]: E0320 09:01:44.434400 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:45 crc kubenswrapper[4958]: I0320 09:01:45.434204 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:45 crc kubenswrapper[4958]: I0320 09:01:45.434251 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:45 crc kubenswrapper[4958]: E0320 09:01:45.435405 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:45 crc kubenswrapper[4958]: I0320 09:01:45.434283 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:45 crc kubenswrapper[4958]: E0320 09:01:45.435588 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:45 crc kubenswrapper[4958]: E0320 09:01:45.435433 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:45 crc kubenswrapper[4958]: E0320 09:01:45.558276 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:01:46 crc kubenswrapper[4958]: I0320 09:01:46.434302 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:46 crc kubenswrapper[4958]: E0320 09:01:46.435835 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:47 crc kubenswrapper[4958]: I0320 09:01:47.107254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:47 crc kubenswrapper[4958]: E0320 09:01:47.107584 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:47 crc kubenswrapper[4958]: E0320 09:01:47.107932 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:01:55.107905119 +0000 UTC m=+135.429921097 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:47 crc kubenswrapper[4958]: I0320 09:01:47.434112 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:47 crc kubenswrapper[4958]: I0320 09:01:47.434192 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:47 crc kubenswrapper[4958]: I0320 09:01:47.434271 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:47 crc kubenswrapper[4958]: E0320 09:01:47.434400 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:47 crc kubenswrapper[4958]: E0320 09:01:47.434294 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:47 crc kubenswrapper[4958]: E0320 09:01:47.434550 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:48 crc kubenswrapper[4958]: I0320 09:01:48.434291 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:48 crc kubenswrapper[4958]: E0320 09:01:48.434477 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.434196 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.434266 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.434356 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.434417 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.434518 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.434642 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.756171 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.756219 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.756229 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.756245 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.756256 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:49Z","lastTransitionTime":"2026-03-20T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.770762 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:49Z is after 
2025-08-24T17:21:41Z" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.775953 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.776020 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.776033 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.776056 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.776071 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:49Z","lastTransitionTime":"2026-03-20T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.790759 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload and image list identical to the 09:01:49.770762 attempt above, elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:49Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.795053 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.795104 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.795114 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.795134 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.795148 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:49Z","lastTransitionTime":"2026-03-20T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.816313 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status [status patch payload and image list identical to the 09:01:49.770762 attempt above, elided] for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:49Z is after
2025-08-24T17:21:41Z" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.821284 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.821343 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.821358 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.821379 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.821391 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:49Z","lastTransitionTime":"2026-03-20T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.835865 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:49Z is after 
2025-08-24T17:21:41Z" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.840931 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.840999 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.841013 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.841033 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:01:49 crc kubenswrapper[4958]: I0320 09:01:49.841047 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:01:49Z","lastTransitionTime":"2026-03-20T09:01:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.855690 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:49Z is after 
2025-08-24T17:21:41Z" Mar 20 09:01:49 crc kubenswrapper[4958]: E0320 09:01:49.855867 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.434152 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:50 crc kubenswrapper[4958]: E0320 09:01:50.434346 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.450429 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.477988 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6
308f7583b21ae01721f2b9e0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.495087 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.509631 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.528152 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.545525 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: E0320 09:01:50.559463 4958 kubelet.go:2916] "Container runtime 
network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.563090 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
6mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\
\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.614962 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.641017 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.658858 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.676397 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.695150 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.708916 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.722474 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.738635 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:50 crc kubenswrapper[4958]: I0320 09:01:50.753672 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:50Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:51 crc kubenswrapper[4958]: I0320 09:01:51.434161 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:51 crc kubenswrapper[4958]: I0320 09:01:51.434263 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:51 crc kubenswrapper[4958]: E0320 09:01:51.434340 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:51 crc kubenswrapper[4958]: E0320 09:01:51.434519 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:51 crc kubenswrapper[4958]: I0320 09:01:51.434284 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:51 crc kubenswrapper[4958]: E0320 09:01:51.434691 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:52 crc kubenswrapper[4958]: I0320 09:01:52.434555 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:52 crc kubenswrapper[4958]: E0320 09:01:52.434777 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:53 crc kubenswrapper[4958]: I0320 09:01:53.433766 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:53 crc kubenswrapper[4958]: I0320 09:01:53.433834 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:53 crc kubenswrapper[4958]: I0320 09:01:53.433918 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:53 crc kubenswrapper[4958]: E0320 09:01:53.433970 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:53 crc kubenswrapper[4958]: E0320 09:01:53.434082 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:53 crc kubenswrapper[4958]: E0320 09:01:53.434158 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:54 crc kubenswrapper[4958]: I0320 09:01:54.434441 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:54 crc kubenswrapper[4958]: E0320 09:01:54.434733 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.203713 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.203921 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.204012 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:11.203987663 +0000 UTC m=+151.526003611 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.434248 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.434836 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.434919 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.435034 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.435536 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.435653 4958 scope.go:117] "RemoveContainer" containerID="879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0" Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.435755 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:55 crc kubenswrapper[4958]: I0320 09:01:55.444987 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 09:01:55 crc kubenswrapper[4958]: E0320 09:01:55.560898 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.151371 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/1.log" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.154172 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.154975 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.171370 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.204898 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e714
9a76d95e8682ba82af5e2ae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.229051 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.257072 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.273168 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.288778 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.304567 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.372981 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.390158 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.406038 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.420162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.434160 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:56 crc kubenswrapper[4958]: E0320 09:01:56.434543 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.435445 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.456893 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.476188 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.489241 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.503520 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"487f802c-61be-42b6-81ad-cc9f43b877f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:56 crc kubenswrapper[4958]: I0320 09:01:56.516711 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:56Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.167502 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.168219 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/1.log" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.170976 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" exitCode=1 Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.171024 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.171070 4958 scope.go:117] "RemoveContainer" containerID="879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.171999 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:01:57 crc kubenswrapper[4958]: E0320 09:01:57.172184 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.191080 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.211464 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e714
9a76d95e8682ba82af5e2ae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://879e75f08c4c8b4552159a83ab9b38194b4bd9c6308f7583b21ae01721f2b9e0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"s/factory.go:140\\\\nI0320 09:01:39.296185 6957 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 09:01:39.294853 6957 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.296280 6957 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 09:01:39.294883 6957 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 09:01:39.294917 6957 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.295036 6957 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 09:01:39.295237 6957 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 09:01:39.297734 6957 factory.go:656] Stopping watch factory\\\\nI0320 09:01:39.300643 6957 ovnkube.go:599] Stopped ovnkube\\\\nI0320 09:01:39.300697 6957 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:39.300790 6957 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:56Z\\\",\\\"message\\\":\\\"openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll openshift-ovn-kubernetes/ovnkube-node-tmjtz openshift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0320 09:01:56.716399 7230 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 09:01:56.716418 7230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716430 7230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716441 7230 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0320 09:01:56.716447 7230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0320 09:01:56.716452 7230 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716473 7230 metrics.go:553] Stopping metrics server at address 
\\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:56.716556 7230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\
\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.224747 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.238754 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.250868 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.264644 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2c
c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-re
lease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.274627 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.292644 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.310326 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.325162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.340161 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.350377 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.361279 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.376448 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.389380 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.402252 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"487f802c-61be-42b6-81ad-cc9f43b877f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d8
8c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.417931 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:57Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.434561 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:57 crc kubenswrapper[4958]: E0320 09:01:57.434777 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.434798 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:57 crc kubenswrapper[4958]: I0320 09:01:57.434913 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:57 crc kubenswrapper[4958]: E0320 09:01:57.434949 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:57 crc kubenswrapper[4958]: E0320 09:01:57.435102 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.176190 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.181155 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:01:58 crc kubenswrapper[4958]: E0320 09:01:58.181466 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.194991 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.213094 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.227532 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.241672 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountP
ath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.254369 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.
11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.268261 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"487f802c-61be-42b6-81ad-cc9f43b877f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.282295 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.294799 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.312852 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e714
9a76d95e8682ba82af5e2ae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:56Z\\\",\\\"message\\\":\\\"openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll openshift-ovn-kubernetes/ovnkube-node-tmjtz openshift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0320 09:01:56.716399 7230 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 09:01:56.716418 7230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716430 7230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716441 7230 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0320 09:01:56.716447 7230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0320 09:01:56.716452 7230 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716473 7230 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:56.716556 7230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.326657 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",
\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.342905 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.354957 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 
09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.375225 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.390162 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.405180 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.420510 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.433206 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:01:58Z is after 2025-08-24T17:21:41Z" Mar 20 09:01:58 crc kubenswrapper[4958]: I0320 09:01:58.434527 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:01:58 crc kubenswrapper[4958]: E0320 09:01:58.434698 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:01:59 crc kubenswrapper[4958]: I0320 09:01:59.433847 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:01:59 crc kubenswrapper[4958]: I0320 09:01:59.433955 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:01:59 crc kubenswrapper[4958]: E0320 09:01:59.434074 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:01:59 crc kubenswrapper[4958]: I0320 09:01:59.434117 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:01:59 crc kubenswrapper[4958]: E0320 09:01:59.434226 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:01:59 crc kubenswrapper[4958]: E0320 09:01:59.434299 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.056930 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.056996 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.057015 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.057041 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.057057 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:00Z","lastTransitionTime":"2026-03-20T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.077783 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.082534 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.082572 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.082585 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.082623 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.082638 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:00Z","lastTransitionTime":"2026-03-20T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.096135 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.100879 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.100929 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.100944 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.100965 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.100980 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:00Z","lastTransitionTime":"2026-03-20T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.118026 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.123356 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.123402 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
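
Every retry above fails the same way, and the node status itself is not the problem: the kubelet's PATCH is rejected by the node.network-node-identity.openshift.io validating webhook because the webhook's serving certificate expired on 2025-08-24T17:21:41Z while the node clock reads 2026-03-20. A minimal sketch for confirming this from the node, assuming Python 3 with the third-party cryptography package is available and that the webhook is still listening on 127.0.0.1:9743 as shown in the log:

    # check_webhook_cert.py -- sketch, not a validated CRC procedure.
    # Fetches the webhook's serving certificate and compares its notAfter
    # against the local clock.
    import ssl
    from datetime import datetime, timezone
    from cryptography import x509  # third-party: pip install cryptography

    HOST, PORT = "127.0.0.1", 9743  # webhook endpoint taken from the log

    # With the default ca_certs=None, get_server_certificate() skips chain
    # verification, so an expired certificate can still be retrieved.
    pem = ssl.get_server_certificate((HOST, PORT))
    cert = x509.load_pem_x509_certificate(pem.encode())

    not_after = cert.not_valid_after_utc  # cryptography >= 42; older versions: not_valid_after
    now = datetime.now(timezone.utc)
    print("notAfter:", not_after.isoformat())
    print("EXPIRED" if now > not_after else "valid", "at node time", now.isoformat())

The certificate may be renewed by the cluster's own rotation machinery once things settle, but that depends on the node clock being trusted; the jump from a 2025 expiry to a 2026 wall clock is the first thing to explain.
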
event="NodeHasNoDiskPressure" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.123411 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.123427 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.123440 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:00Z","lastTransitionTime":"2026-03-20T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.138006 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.142576 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.142648 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
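
The NotReady condition repeated in each cycle has a separate proximate cause: no CNI config exists in /etc/kubernetes/cni/net.d/ because ovnkube-controller, which writes it, is itself crash-looping (its CrashLoopBackOff appears further down). A quick look at what the runtime would find there, a sketch assuming the conventional CNI config file extensions:

    # list_cni_configs.py -- sketch: show which CNI network configs, if any,
    # exist where the kubelet message says the runtime is looking.
    from pathlib import Path

    CNI_DIR = Path("/etc/kubernetes/cni/net.d")  # path from the NotReady message
    # .conf/.conflist/.json are the extensions CNI config loaders conventionally accept
    configs = sorted(p for p in CNI_DIR.glob("*")
                     if p.suffix in {".conf", ".conflist", ".json"})
    if not configs:
        print(f"no CNI configuration file in {CNI_DIR} (matches the NotReady message)")
    for p in configs:
        print(p, p.stat().st_size, "bytes")
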
event="NodeHasNoDiskPressure" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.142663 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.142684 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.142698 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:00Z","lastTransitionTime":"2026-03-20T09:02:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.156541 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.156694 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.434007 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.434194 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.449590 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.470675 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e714
9a76d95e8682ba82af5e2ae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:56Z\\\",\\\"message\\\":\\\"openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll openshift-ovn-kubernetes/ovnkube-node-tmjtz openshift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0320 09:01:56.716399 7230 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 09:01:56.716418 7230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716430 7230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716441 7230 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0320 09:01:56.716447 7230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0320 09:01:56.716452 7230 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716473 7230 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:56.716556 7230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.490501 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac
2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.504552 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.519364 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.530923 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.544130 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.557460 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: E0320 09:02:00.562326 4958 kubelet.go:2916] "Container runtime 
network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.574040 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2
6mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\
\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.586512 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.601010 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.615728 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready 
status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.629673 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.641994 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.653181 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.670028 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"487f802c-61be-42b6-81ad-cc9f43b877f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:00 crc kubenswrapper[4958]: I0320 09:02:00.684486 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":t
rue,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:00Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:01 crc kubenswrapper[4958]: I0320 09:02:01.433730 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:01 crc kubenswrapper[4958]: I0320 09:02:01.433846 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:01 crc kubenswrapper[4958]: E0320 09:02:01.433910 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:01 crc kubenswrapper[4958]: E0320 09:02:01.434027 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:01 crc kubenswrapper[4958]: I0320 09:02:01.434124 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:01 crc kubenswrapper[4958]: E0320 09:02:01.434183 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:02 crc kubenswrapper[4958]: I0320 09:02:02.433814 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:02 crc kubenswrapper[4958]: E0320 09:02:02.433965 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:03 crc kubenswrapper[4958]: I0320 09:02:03.434389 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:03 crc kubenswrapper[4958]: I0320 09:02:03.434510 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:03 crc kubenswrapper[4958]: I0320 09:02:03.434417 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:03 crc kubenswrapper[4958]: E0320 09:02:03.434673 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:03 crc kubenswrapper[4958]: E0320 09:02:03.434876 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:03 crc kubenswrapper[4958]: E0320 09:02:03.435067 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:04 crc kubenswrapper[4958]: I0320 09:02:04.434758 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:04 crc kubenswrapper[4958]: E0320 09:02:04.435269 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:04 crc kubenswrapper[4958]: I0320 09:02:04.448200 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 09:02:05 crc kubenswrapper[4958]: I0320 09:02:05.434820 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:05 crc kubenswrapper[4958]: I0320 09:02:05.434820 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:05 crc kubenswrapper[4958]: I0320 09:02:05.434848 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:05 crc kubenswrapper[4958]: E0320 09:02:05.435230 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:05 crc kubenswrapper[4958]: E0320 09:02:05.434969 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:05 crc kubenswrapper[4958]: E0320 09:02:05.435294 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:05 crc kubenswrapper[4958]: E0320 09:02:05.564185 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:06 crc kubenswrapper[4958]: I0320 09:02:06.434405 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:06 crc kubenswrapper[4958]: E0320 09:02:06.434611 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:07 crc kubenswrapper[4958]: I0320 09:02:07.434776 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:07 crc kubenswrapper[4958]: I0320 09:02:07.434805 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:07 crc kubenswrapper[4958]: E0320 09:02:07.434941 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:07 crc kubenswrapper[4958]: I0320 09:02:07.434980 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:07 crc kubenswrapper[4958]: E0320 09:02:07.435112 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:07 crc kubenswrapper[4958]: E0320 09:02:07.435162 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:08 crc kubenswrapper[4958]: I0320 09:02:08.434850 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:08 crc kubenswrapper[4958]: E0320 09:02:08.435029 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.370067 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.370241 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370390 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 09:03:13.370339299 +0000 UTC m=+213.692355297 (durationBeforeRetry 1m4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370491 4958 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.370521 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370628 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:13.370580277 +0000 UTC m=+213.692596456 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.370656 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370700 4958 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.370726 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370744 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:13.370732502 +0000 UTC m=+213.692748460 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370873 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370886 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370899 4958 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370924 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370940 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:13.370930198 +0000 UTC m=+213.692946156 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370953 4958 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.370971 4958 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.371045 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 09:03:13.371020981 +0000 UTC m=+213.693037139 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.434288 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.434416 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.434458 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.434539 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.434639 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.434722 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:09 crc kubenswrapper[4958]: I0320 09:02:09.435522 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:02:09 crc kubenswrapper[4958]: E0320 09:02:09.435904 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\"" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.264380 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.264435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.264448 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.264464 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.264475 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:10Z","lastTransitionTime":"2026-03-20T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.282893 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.287363 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.287399 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.287408 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.287423 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.287437 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:10Z","lastTransitionTime":"2026-03-20T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.301082 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.305914 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.305957 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.305972 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.305992 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.306006 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:10Z","lastTransitionTime":"2026-03-20T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.319478 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.324381 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.324435 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.324449 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.324468 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.324478 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:10Z","lastTransitionTime":"2026-03-20T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.338425 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.342520 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.342556 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.342567 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.342583 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.342609 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:10Z","lastTransitionTime":"2026-03-20T09:02:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.356926 4958 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404564Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865364Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f885a277-9b85-4e30-8d86-f10d1510a78a\\\",\\\"systemUUID\\\":\\\"4d937261-ad72-4cd3-9e28-1484a891ee0d\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.357082 4958 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.434513 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.434699 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.451165 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e6b0906b-8384-49d8-8273-062c7e86148c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:50Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f883f0a9bc518bf7cdbdcef43df507c8e7162636bb3bcfe5dcacd54b3fd8dfed\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://79a45e31d9a110cf93cb0a64d57274448e46eb9eda8456969224a588d9d9c96b\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 09:00:10.175538 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 09:00:10.177098 1 observer_polling.go:159] Starting file observer\\\\nI0320 09:00:10.180167 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 09:00:10.181209 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 09:00:39.803720 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 09:00:39.803855 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:10Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:00:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://485288d9b577950a20ea275f1289685b34ff9cf6debe3c6ddc1170b70ff8ef88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c8433267d0628369259445adcb5c89c240d4a22a3f5de354dbf6c19b5e7f20fc\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d42563ceeeca2989e69c343c8e480952171423e86ec7da3f23acf67ea844b52b\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.466535 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"487f802c-61be-42b6-81ad-cc9f43b877f8\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:00:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://273ff1ec96f2de36f5ffa6ab14769c02adebefee79570067e577bd3dd785cdba\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff04bc5b55cc7332602831d11a7597b8831883b5dc8d90fbcb7b655ec359fae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-c
ert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0cb1bd5af090297500b89f5c67d052147fcd6f42f6e49f3fc26d1525998439f0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bd4ffdc7d02d448cd8f88f42bb31d47002eccc6739ebd57a4d8198c9f3dbad73\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.486412 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2ee1bc9d619d69d66f9bd0fc87eb5010ad0dc0d89e3ccd7b2a39b99a38bcf421\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bec758225faefd307d1a80f3e31932d46ca12649bcca11441d2011d1474c81f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.502714 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.526872 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"eb4de400-dc39-4926-8311-279b913e5871\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e714
9a76d95e8682ba82af5e2ae7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:01:56Z\\\",\\\"message\\\":\\\"openshift-kube-apiserver/kube-apiserver-crc openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll openshift-ovn-kubernetes/ovnkube-node-tmjtz openshift-etcd/etcd-crc openshift-network-operator/network-operator-58b4c7f79c-55gtf]\\\\nI0320 09:01:56.716399 7230 obj_retry.go:418] Waiting for all the *v1.Pod retry setup to complete in iterateRetryResources\\\\nI0320 09:01:56.716418 7230 obj_retry.go:303] Retry object setup: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716430 7230 obj_retry.go:365] Adding new object: *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716441 7230 ovn.go:134] Ensuring zone local for Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf in node crc\\\\nI0320 09:01:56.716447 7230 obj_retry.go:386] Retry successful for *v1.Pod openshift-network-operator/network-operator-58b4c7f79c-55gtf after 0 failed attempt(s)\\\\nI0320 09:01:56.716452 7230 default_network_controller.go:776] Recording success event on pod openshift-network-operator/network-operator-58b4c7f79c-55gtf\\\\nI0320 09:01:56.716473 7230 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 09:01:56.716556 7230 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:55Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-tmjtz_openshift-ovn-kubernetes(eb4de400-dc39-4926-8311-279b913e5871)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:27Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-gpg76\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-tmjtz\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.550448 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"
}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt
\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"nam
e\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: E0320 09:02:10.566267 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.567046 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 
09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.591010 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"lo
g-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reas
on\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.608887 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.624723 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
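(Annotation.) Earlier in this window the ovnkube-controller container sat in CrashLoopBackOff with "back-off 20s" at restartCount 2. That interval matches Kubernetes' documented restart back-off: the delay starts at 10s, doubles after each crash, and is capped at 5 minutes. A sketch of that schedule (an illustration of the documented behavior, not kubelet source):

```python
# CrashLoopBackOff delay: starts at 10s, doubles per crash, capped at 5 minutes.
# restartCount=2 yields the "back-off 20s" seen for ovnkube-controller above.
BASE_SECONDS = 10
CAP_SECONDS = 300

def crashloop_delay(restart_count: int) -> int:
    """Back-off delay (seconds) before the next restart attempt."""
    return min(BASE_SECONDS * 2 ** max(restart_count - 1, 0), CAP_SECONDS)

for n in range(1, 8):
    print(n, crashloop_delay(n))   # 10, 20, 40, 80, 160, 300, 300
```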
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.642101 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.654908 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.675813 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.690084 4958 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.704208 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.716931 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cn
i/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.728825 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-25jgh" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"0947786e-ea2f-478d-b90f-c8f9d33e9999\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9ab6500d4a8df3e0c86a5a5f7aa9b8c6c5d0877258b9083642e3a2995da6d359\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lcs6t\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:32Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-25jgh\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:10 crc kubenswrapper[4958]: I0320 09:02:10.740698 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:10Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:11 crc kubenswrapper[4958]: I0320 09:02:11.293787 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:11 crc kubenswrapper[4958]: E0320 09:02:11.293983 4958 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:02:11 crc kubenswrapper[4958]: E0320 09:02:11.294095 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs podName:14288bf2-b6fe-4961-ad00-a39f76ff1a78 nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.294071698 +0000 UTC m=+183.616087656 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs") pod "network-metrics-daemon-trr7n" (UID: "14288bf2-b6fe-4961-ad00-a39f76ff1a78") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 09:02:11 crc kubenswrapper[4958]: I0320 09:02:11.433844 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:11 crc kubenswrapper[4958]: I0320 09:02:11.433882 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:11 crc kubenswrapper[4958]: I0320 09:02:11.433923 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:11 crc kubenswrapper[4958]: E0320 09:02:11.434046 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:11 crc kubenswrapper[4958]: E0320 09:02:11.434172 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:11 crc kubenswrapper[4958]: E0320 09:02:11.434284 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:12 crc kubenswrapper[4958]: I0320 09:02:12.433918 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:12 crc kubenswrapper[4958]: E0320 09:02:12.434086 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:13 crc kubenswrapper[4958]: I0320 09:02:13.434535 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:13 crc kubenswrapper[4958]: I0320 09:02:13.434535 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:13 crc kubenswrapper[4958]: E0320 09:02:13.434723 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:13 crc kubenswrapper[4958]: E0320 09:02:13.434751 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:13 crc kubenswrapper[4958]: I0320 09:02:13.434552 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:13 crc kubenswrapper[4958]: E0320 09:02:13.434836 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:14 crc kubenswrapper[4958]: I0320 09:02:14.433788 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:14 crc kubenswrapper[4958]: E0320 09:02:14.433971 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.240113 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lht4x_1479666a-d3f9-47dc-aa36-45cc7425d7ee/kube-multus/0.log" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.240177 4958 generic.go:334] "Generic (PLEG): container finished" podID="1479666a-d3f9-47dc-aa36-45cc7425d7ee" containerID="b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623" exitCode=1 Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.240219 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerDied","Data":"b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623"} Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.240684 4958 scope.go:117] "RemoveContainer" containerID="b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.265699 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d3bb0dff-98a7-4359-841f-5fb469ebc3f4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8e13b6fcdd438dcb103d1498a6805f760996c5deb362ea050e479bcd9d2ef2fe\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8
c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-vgtvt\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-kvsdf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.288179 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-wjb45" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"31474b1f-5bf9-4201-95c2-864df0fed1d0\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c0f1ab1e010683f43397466099f5a12e7593a73daa5c0f00c7058e021541c610\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://87c0eb22a4d8a7faddc2fe2bf2e89a58750972dc95b8b84644d5a92aa679ddd3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:27Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ed56ad86a13a25ddd96d512cea27129f7788807ecb66a0b16ff901e803406377\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:29Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://21c94f42ed6b45d46091ec18d2571edf0f6bd2826dbdb176029c7f57014532ac\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:30Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:30Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true
,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c067fd0e5d0b5d9eb5cf92465fad0aad5d7128b1ee079f5d5ca65f8b3122186a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:31Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:31Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea525ef639336fcdafc6a66ff9bb2ca1bf884f08881cc8374e5f45e2f6fb3ea5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:32Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://41dd85f86bbd0f48f548ca7e54e66882a8834f9fa2b28bfe0c4e772d35ba590c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T09:01:34Z\\\",\\\"reason\\\
":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-26mbm\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-wjb45\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.303856 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e8af587c-3589-43ca-800d-f908c8e18cbb\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b4e25d9212127873b94032442688e1a2a7a9d5fb5782049ca7831da0f0c54f01\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a3701cabfe91527b7e96ee7c2d51102f501a424f5cebe8a2dd79b78ca551bdb4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bhbjv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:38Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-42cll\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.327078 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d39b209e-3de5-42db-af9d-252a52e840e9\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://63531b7bc0850111d2c063bf9b8221d9a810c687f5c94331c5c39f9bfbaf82cd\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://25d24fa5cbc33d6590454eac2c8a6004a0eee594fda24a9ef74109cc0b3f1b60\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c68774
41ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90a9b41b05e94e502a57fd32afbdc9cb8e427f49d4db136841413c97cb5b00ce\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://8fcac373c18e3af0696bbebfc136b8532c501e69e00b266d7906e2c96d10893b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://98b1270566f37f3d2d5ffc2c85914a66224c786f45075e1e32f90cc02401f3b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"
lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://815fac2854d5f232ca37bdd2fca3e6116b1c598c1686618f96b4363b38a37b20\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://198777f494880811a94219febe1f8428d8616f014bfa3e68c3876b1e6074f7eb\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3588c45f0d5db5cd0dd86bdedc41f9418596c708f098a07344c929b3bca836c5\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.345154 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"382f857a-419b-4239-98bd-5f96a093f2cd\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T08:59:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T09:00:39Z\\\",\\\"message\\\":\\\"W0320 09:00:38.738976 1 cmd.go:257] Using insecure, self-signed certificates\\\\nI0320 09:00:38.739469 1 crypto.go:601] Generating new CA for check-endpoints-signer@1773997238 cert, and key in /tmp/serving-cert-3854390526/serving-signer.crt, /tmp/serving-cert-3854390526/serving-signer.key\\\\nI0320 09:00:39.077415 1 observer_polling.go:159] Starting file observer\\\\nW0320 09:00:39.081163 1 builder.go:272] unable to get owner reference (falling back to namespace): Get \\\\\\\"https://localhost:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 2026-02-23T05:33:16Z\\\\nI0320 09:00:39.081361 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 09:00:39.081964 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3854390526/tls.crt::/tmp/serving-cert-3854390526/tls.key\\\\\\\"\\\\nF0320 09:00:39.263355 1 cmd.go:182] error initializing delegating authentication: unable to load configmap based request-header-client-ca-file: Get \\\\\\\"https://localhost:6443/api/v1/namespaces/kube-system/configmaps/extension-apiserver-authentication\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:00:39Z is after 
2026-02-23T05:33:16Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:00:38Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:28Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T08:59:43Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T08:59:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T08:59:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T08:59:40Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.359165 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:06Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d96599ad93a5571f80a21b4ff5baaac5076346fef56d9e72ccb5dbf193f77952\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:05Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.372334 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:08Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://af08682d2fd3aa22de2e465821125d4214b0c8c685efb488c5d1ab7d0f52cf16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:08Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.384733 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-p2twx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"fd744235-23b7-408d-958b-90a9219c6fd1\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:25Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e995f5fd79ec00f61b652d4a9467917559c8e719ec41e38dad324a6de131611d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-w76jx\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:25Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-p2twx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.397713 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-trr7n" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"14288bf2-b6fe-4961-ad00-a39f76ff1a78\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:39Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xdpxg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:39Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-trr7n\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.415014 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.432002 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:05Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.434289 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.434339 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.434512 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:15 crc kubenswrapper[4958]: E0320 09:02:15.434511 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:15 crc kubenswrapper[4958]: E0320 09:02:15.434723 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:15 crc kubenswrapper[4958]: E0320 09:02:15.435151 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.447817 4958 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-lht4x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"1479666a-d3f9-47dc-aa36-45cc7425d7ee\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:01:26Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T09:02:15Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T09:02:14Z\\\",\\\"message\\\":\\\"2026-03-20T09:01:29+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_66e2a0af-1497-4a21-834f-ef5424cfe97d\\\\n2026-03-20T09:01:29+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_66e2a0af-1497-4a21-834f-ef5424cfe97d to /host/opt/cni/bin/\\\\n2026-03-20T09:01:29Z [verbose] multus-daemon started\\\\n2026-03-20T09:01:29Z [verbose] Readiness Indicator file check\\\\n2026-03-20T09:02:14Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T09:01:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zxv5q\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T09:01:26Z\\\"}}\" for pod \"openshift-multus\"/\"multus-lht4x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T09:02:15Z is after 2025-08-24T17:21:41Z" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.470560 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-25jgh" podStartSLOduration=83.470539713 podStartE2EDuration="1m23.470539713s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:15.470207253 +0000 UTC m=+155.792223211" watchObservedRunningTime="2026-03-20 09:02:15.470539713 +0000 UTC m=+155.792555671" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.516939 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=11.516913042 podStartE2EDuration="11.516913042s" podCreationTimestamp="2026-03-20 09:02:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:15.501710188 +0000 UTC m=+155.823726166" watchObservedRunningTime="2026-03-20 09:02:15.516913042 +0000 UTC 
m=+155.838929000" Mar 20 09:02:15 crc kubenswrapper[4958]: I0320 09:02:15.534140 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=20.534114169 podStartE2EDuration="20.534114169s" podCreationTimestamp="2026-03-20 09:01:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:15.517564282 +0000 UTC m=+155.839580240" watchObservedRunningTime="2026-03-20 09:02:15.534114169 +0000 UTC m=+155.856130147" Mar 20 09:02:15 crc kubenswrapper[4958]: E0320 09:02:15.568146 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.246146 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lht4x_1479666a-d3f9-47dc-aa36-45cc7425d7ee/kube-multus/0.log" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.246215 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerStarted","Data":"c1fa38ee671c6c3b38ada148c663ec96fd3a75dee770fb81c797ad6fa7b1b033"} Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.287728 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=61.287697046 podStartE2EDuration="1m1.287697046s" podCreationTimestamp="2026-03-20 09:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.287399967 +0000 UTC m=+156.609415925" watchObservedRunningTime="2026-03-20 09:02:16.287697046 +0000 UTC m=+156.609713004" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.288201 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-42cll" podStartSLOduration=83.28819244 podStartE2EDuration="1m23.28819244s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.260756491 +0000 UTC m=+156.582772489" watchObservedRunningTime="2026-03-20 09:02:16.28819244 +0000 UTC m=+156.610208398" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.302243 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=61.302204699 podStartE2EDuration="1m1.302204699s" podCreationTimestamp="2026-03-20 09:01:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.301674763 +0000 UTC m=+156.623690721" watchObservedRunningTime="2026-03-20 09:02:16.302204699 +0000 UTC m=+156.624220657" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.344628 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-p2twx" podStartSLOduration=84.344584646 podStartE2EDuration="1m24.344584646s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 
00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.34340912 +0000 UTC m=+156.665425078" watchObservedRunningTime="2026-03-20 09:02:16.344584646 +0000 UTC m=+156.666600604" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.377435 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-wjb45" podStartSLOduration=84.37741006 podStartE2EDuration="1m24.37741006s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.376937046 +0000 UTC m=+156.698953004" watchObservedRunningTime="2026-03-20 09:02:16.37741006 +0000 UTC m=+156.699426018" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.377583 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podStartSLOduration=84.377578295 podStartE2EDuration="1m24.377578295s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.361021139 +0000 UTC m=+156.683037107" watchObservedRunningTime="2026-03-20 09:02:16.377578295 +0000 UTC m=+156.699594253" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.423407 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lht4x" podStartSLOduration=84.423382227 podStartE2EDuration="1m24.423382227s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:16.422985295 +0000 UTC m=+156.745001273" watchObservedRunningTime="2026-03-20 09:02:16.423382227 +0000 UTC m=+156.745398185" Mar 20 09:02:16 crc kubenswrapper[4958]: I0320 09:02:16.434079 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:16 crc kubenswrapper[4958]: E0320 09:02:16.434223 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:17 crc kubenswrapper[4958]: I0320 09:02:17.434678 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:17 crc kubenswrapper[4958]: I0320 09:02:17.434845 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:17 crc kubenswrapper[4958]: E0320 09:02:17.434873 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:17 crc kubenswrapper[4958]: E0320 09:02:17.435066 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:17 crc kubenswrapper[4958]: I0320 09:02:17.434725 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:17 crc kubenswrapper[4958]: E0320 09:02:17.435721 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:18 crc kubenswrapper[4958]: I0320 09:02:18.433897 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:18 crc kubenswrapper[4958]: E0320 09:02:18.434067 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:19 crc kubenswrapper[4958]: I0320 09:02:19.434453 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:19 crc kubenswrapper[4958]: I0320 09:02:19.434453 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:19 crc kubenswrapper[4958]: E0320 09:02:19.434663 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:19 crc kubenswrapper[4958]: I0320 09:02:19.434839 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:19 crc kubenswrapper[4958]: E0320 09:02:19.434949 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:19 crc kubenswrapper[4958]: E0320 09:02:19.435086 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.434299 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:20 crc kubenswrapper[4958]: E0320 09:02:20.435179 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:20 crc kubenswrapper[4958]: E0320 09:02:20.569776 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.594720 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.594788 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.594801 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.594824 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.594839 4958 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T09:02:20Z","lastTransitionTime":"2026-03-20T09:02:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.642230 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl"] Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.643138 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.645346 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.646131 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.646189 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.646836 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.791203 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.791279 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.791380 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea858e96-0614-4ea4-928d-4dde0df4faf2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.791557 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea858e96-0614-4ea4-928d-4dde0df4faf2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.791687 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea858e96-0614-4ea4-928d-4dde0df4faf2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea858e96-0614-4ea4-928d-4dde0df4faf2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc 
kubenswrapper[4958]: I0320 09:02:20.893157 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893217 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893257 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea858e96-0614-4ea4-928d-4dde0df4faf2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893305 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea858e96-0614-4ea4-928d-4dde0df4faf2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.893378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/ea858e96-0614-4ea4-928d-4dde0df4faf2-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.894078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ea858e96-0614-4ea4-928d-4dde0df4faf2-service-ca\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.902364 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea858e96-0614-4ea4-928d-4dde0df4faf2-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.911799 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ea858e96-0614-4ea4-928d-4dde0df4faf2-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-l6nsl\" (UID: \"ea858e96-0614-4ea4-928d-4dde0df4faf2\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:20 crc kubenswrapper[4958]: I0320 09:02:20.961997 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.265983 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" event={"ID":"ea858e96-0614-4ea4-928d-4dde0df4faf2","Type":"ContainerStarted","Data":"ef003df76949b755251a50baead7b3e023dca84aed82d58d227e30ae688efda6"} Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.266868 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" event={"ID":"ea858e96-0614-4ea4-928d-4dde0df4faf2","Type":"ContainerStarted","Data":"2cda9999fa29407a1174b3527b033c8330949d59b4a836bd131beeb0841b3671"} Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.284457 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-l6nsl" podStartSLOduration=89.284428289 podStartE2EDuration="1m29.284428289s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:21.28382856 +0000 UTC m=+161.605844508" watchObservedRunningTime="2026-03-20 09:02:21.284428289 +0000 UTC m=+161.606444247" Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.434878 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.435033 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:21 crc kubenswrapper[4958]: E0320 09:02:21.435098 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:21 crc kubenswrapper[4958]: E0320 09:02:21.435233 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.435749 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:21 crc kubenswrapper[4958]: E0320 09:02:21.436008 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.442931 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 09:02:21 crc kubenswrapper[4958]: I0320 09:02:21.453567 4958 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 09:02:22 crc kubenswrapper[4958]: I0320 09:02:22.434553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:22 crc kubenswrapper[4958]: E0320 09:02:22.434814 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:23 crc kubenswrapper[4958]: I0320 09:02:23.434071 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:23 crc kubenswrapper[4958]: I0320 09:02:23.434143 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:23 crc kubenswrapper[4958]: I0320 09:02:23.434107 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:23 crc kubenswrapper[4958]: E0320 09:02:23.434502 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:23 crc kubenswrapper[4958]: E0320 09:02:23.434675 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:23 crc kubenswrapper[4958]: E0320 09:02:23.434853 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:23 crc kubenswrapper[4958]: I0320 09:02:23.436059 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.279285 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log" Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.282142 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerStarted","Data":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"} Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.282764 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.317959 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podStartSLOduration=92.317936043 podStartE2EDuration="1m32.317936043s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:24.317446488 +0000 UTC m=+164.639462456" watchObservedRunningTime="2026-03-20 09:02:24.317936043 +0000 UTC m=+164.639952001" Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.434147 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:24 crc kubenswrapper[4958]: E0320 09:02:24.434318 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:24 crc kubenswrapper[4958]: I0320 09:02:24.575461 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-trr7n"] Mar 20 09:02:25 crc kubenswrapper[4958]: I0320 09:02:25.285010 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:25 crc kubenswrapper[4958]: E0320 09:02:25.285143 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:25 crc kubenswrapper[4958]: I0320 09:02:25.434493 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:25 crc kubenswrapper[4958]: I0320 09:02:25.434494 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:25 crc kubenswrapper[4958]: I0320 09:02:25.434668 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:25 crc kubenswrapper[4958]: E0320 09:02:25.434752 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:25 crc kubenswrapper[4958]: E0320 09:02:25.435203 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:25 crc kubenswrapper[4958]: E0320 09:02:25.435300 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:25 crc kubenswrapper[4958]: E0320 09:02:25.571274 4958 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 09:02:27 crc kubenswrapper[4958]: I0320 09:02:27.434334 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:27 crc kubenswrapper[4958]: I0320 09:02:27.434348 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:27 crc kubenswrapper[4958]: E0320 09:02:27.434978 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:27 crc kubenswrapper[4958]: I0320 09:02:27.434388 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:27 crc kubenswrapper[4958]: I0320 09:02:27.434301 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:27 crc kubenswrapper[4958]: E0320 09:02:27.435141 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:27 crc kubenswrapper[4958]: E0320 09:02:27.435326 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:27 crc kubenswrapper[4958]: E0320 09:02:27.435487 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:29 crc kubenswrapper[4958]: I0320 09:02:29.434618 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:29 crc kubenswrapper[4958]: I0320 09:02:29.434668 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:29 crc kubenswrapper[4958]: I0320 09:02:29.434894 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:29 crc kubenswrapper[4958]: I0320 09:02:29.434913 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:29 crc kubenswrapper[4958]: E0320 09:02:29.435002 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 09:02:29 crc kubenswrapper[4958]: E0320 09:02:29.435143 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-trr7n" podUID="14288bf2-b6fe-4961-ad00-a39f76ff1a78" Mar 20 09:02:29 crc kubenswrapper[4958]: E0320 09:02:29.435284 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 09:02:29 crc kubenswrapper[4958]: E0320 09:02:29.435305 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 09:02:29 crc kubenswrapper[4958]: I0320 09:02:29.452181 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.928193 4958 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.983653 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=1.983624279 podStartE2EDuration="1.983624279s" podCreationTimestamp="2026-03-20 09:02:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:30.45284633 +0000 UTC m=+170.774862298" watchObservedRunningTime="2026-03-20 09:02:30.983624279 +0000 UTC m=+171.305640247" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.985729 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lv6ph"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.986651 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.987017 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.987129 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.989184 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.995194 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.995380 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.995490 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.995768 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.996139 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.996391 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.996654 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.996808 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6j2mb"] Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.997324 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:30 crc kubenswrapper[4958]: I0320 09:02:30.997392 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.002200 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.002409 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.002524 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.007621 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.012111 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.012417 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.014101 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031169 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031254 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031475 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zc29j\" (UniqueName: \"kubernetes.io/projected/500ed42c-e31d-40ac-90c5-3c4a4184a109-kube-api-access-zc29j\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031716 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031772 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031801 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhlqk\" (UniqueName: \"kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031825 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-node-pullsecrets\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031884 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.031967 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-audit-dir\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032001 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032026 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj2wq\" (UniqueName: \"kubernetes.io/projected/0abda610-306f-48a3-b854-402ed122541d-kube-api-access-vj2wq\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032063 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032086 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032114 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032139 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g66fx\" (UniqueName: \"kubernetes.io/projected/bbe16922-1799-410a-bf9f-56b3818a7e94-kube-api-access-g66fx\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032213 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032237 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-image-import-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032262 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032291 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500ed42c-e31d-40ac-90c5-3c4a4184a109-serving-cert\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032314 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-serving-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032427 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-config\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032466 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-audit\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032488 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032546 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k7mmp\" (UniqueName: \"kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032575 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-client\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032644 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032668 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032724 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abda610-306f-48a3-b854-402ed122541d-config\") pod 
\"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032748 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032770 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abda610-306f-48a3-b854-402ed122541d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032790 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-serving-cert\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032831 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-encryption-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032854 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032880 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032909 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6j2mb\" 
(UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032931 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.032962 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034049 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034265 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034381 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034748 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034785 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034791 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.034835 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.035120 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.036106 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.036450 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-x265h"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.036822 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.036916 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fj78w"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.037400 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.037463 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.037810 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.045688 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqfn6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.046981 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wxtz6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.047145 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.047676 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9vnqx"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.047761 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.048412 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053285 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053386 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053472 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053691 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053702 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.053869 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.054028 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.054039 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.054143 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.054097 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" 
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.054258 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055179 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055284 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055379 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055458 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055572 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055621 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055728 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.055797 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.058756 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.058997 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.060015 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.063241 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.063985 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.064774 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.066476 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.075515 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.076495 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.079083 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.079922 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.080230 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.081100 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.083428 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.097987 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.105660 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.106704 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.108017 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.109045 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.110377 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.121487 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gksr4"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.122245 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.122625 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.123796 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.124340 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.124484 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.124800 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.124980 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125116 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125378 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125533 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125959 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125984 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126044 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126144 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126293 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126304 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126638 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126739 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126459 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126496 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.126515 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.127307 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.127553 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.127755 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.127952 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128080 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.127776 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128424 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128656 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.125972 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128813 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128947 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129088 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129226 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129266 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.128750 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129496 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129098 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129730 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129153 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129866 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129457 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129832 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.129657 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.130136 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.130155 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.131379 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.132124 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.132349 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.132565 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.132737 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.133018 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.133361 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.133678 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.134337 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.135096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.135361 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.135535 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.135709 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.136693 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137031 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137359 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137456 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137625 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137815 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137869 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.137997 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.138590 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.140490 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.141984 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143498 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-audit-dir\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143582 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143629 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vj2wq\" (UniqueName: \"kubernetes.io/projected/0abda610-306f-48a3-b854-402ed122541d-kube-api-access-vj2wq\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143664 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143688 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143769 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqdnm\" (UniqueName: \"kubernetes.io/projected/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-kube-api-access-mqdnm\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143800 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-serving-cert\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143848 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g66fx\" (UniqueName: \"kubernetes.io/projected/bbe16922-1799-410a-bf9f-56b3818a7e94-kube-api-access-g66fx\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143877 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xww6b\" (UniqueName: \"kubernetes.io/projected/0051c7c2-c695-478a-b746-554f8c649495-kube-api-access-xww6b\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143909 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143929 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143952 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-config\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143972 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vfsr5\" (UniqueName: \"kubernetes.io/projected/f0d46cc6-8881-4edc-b186-4388a3ced86b-kube-api-access-vfsr5\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.143974 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.144165 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.144226 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-audit-dir\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.145148 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-service-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.146469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.149394 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.150826 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.153315 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.154810 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.156036 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-image-import-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.160264 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb"]
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162187 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162589 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0d46cc6-8881-4edc-b186-4388a3ced86b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162681 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162710 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162737 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500ed42c-e31d-40ac-90c5-3c4a4184a109-serving-cert\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"
Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162763 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-serving-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: 
\"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162818 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162843 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-config\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162869 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162891 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-client\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162927 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-audit\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162952 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.162991 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k7mmp\" (UniqueName: \"kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163010 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-client\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163031 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163059 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163086 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abda610-306f-48a3-b854-402ed122541d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163112 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163135 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163163 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-serving-cert\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163205 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-encryption-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163226 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 
09:02:31.163254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163277 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abda610-306f-48a3-b854-402ed122541d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163310 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163330 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163349 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163383 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163403 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163423 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163443 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/f0d46cc6-8881-4edc-b186-4388a3ced86b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163474 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0051c7c2-c695-478a-b746-554f8c649495-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163524 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-policies\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163543 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6gwdx\" (UniqueName: \"kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163562 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0051c7c2-c695-478a-b746-554f8c649495-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163581 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-encryption-config\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163616 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-dir\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163638 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163656 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-zc29j\" (UniqueName: \"kubernetes.io/projected/500ed42c-e31d-40ac-90c5-3c4a4184a109-kube-api-access-zc29j\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163689 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9rg9\" (UniqueName: \"kubernetes.io/projected/b57755bc-b4cd-4b4f-b040-381c0e98b166-kube-api-access-d9rg9\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163720 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163758 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163779 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163798 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhlqk\" (UniqueName: \"kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163834 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-node-pullsecrets\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.163863 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-metrics-tls\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.164546 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.165081 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-image-import-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.165750 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.166385 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-config\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.166454 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.166678 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.166870 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-serving-ca\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.167185 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: 
\"kubernetes.io/host-path/bbe16922-1799-410a-bf9f-56b3818a7e94-node-pullsecrets\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.167190 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ncgcc"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.180527 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/500ed42c-e31d-40ac-90c5-3c4a4184a109-serving-cert\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.181825 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.186131 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.186417 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.186494 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-audit\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.187853 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-encryption-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.189045 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.189358 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies\") pod \"oauth-openshift-558db77b4-6j2mb\" 
(UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.190667 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.191109 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-serving-cert\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.191305 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/500ed42c-e31d-40ac-90c5-3c4a4184a109-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.191359 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-trusted-ca-bundle\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.191790 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0abda610-306f-48a3-b854-402ed122541d-config\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.191864 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0abda610-306f-48a3-b854-402ed122541d-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.192032 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.192617 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.192676 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: 
\"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.193488 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbe16922-1799-410a-bf9f-56b3818a7e94-config\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.194261 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.194498 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.194493 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.194932 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bbe16922-1799-410a-bf9f-56b3818a7e94-etcd-client\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.197994 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.198022 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-7qnx6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.198173 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.199392 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.199494 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.200564 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.202314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.207954 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cx5r7"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.226070 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.229964 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.230063 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.230310 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.231842 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.232518 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-xpvqq"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.234129 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566622-xd9xt"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.234301 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.234660 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.234977 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.236177 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.236309 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.237120 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.237427 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.238015 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.238713 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.243139 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lv6ph"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.243183 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.243208 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-krxrr"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.243269 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.244659 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.244991 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.246180 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.247945 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.248570 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.249342 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.250643 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fj78w"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.251731 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.252876 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.254068 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqfn6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.255186 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9vnqx"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.256368 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.257447 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.258711 4958 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-xd9xt"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.259752 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6j2mb"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.260704 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ncgcc"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.261824 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.263254 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.264880 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.264914 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-config\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.264961 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vfsr5\" (UniqueName: \"kubernetes.io/projected/f0d46cc6-8881-4edc-b186-4388a3ced86b-kube-api-access-vfsr5\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.264985 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.265023 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0d46cc6-8881-4edc-b186-4388a3ced86b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.265055 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.265073 4958 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266009 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f0d46cc6-8881-4edc-b186-4388a3ced86b-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266052 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gksr4"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266073 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-client\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266084 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266156 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266174 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266199 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d46cc6-8881-4edc-b186-4388a3ced86b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/0051c7c2-c695-478a-b746-554f8c649495-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266240 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-policies\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266266 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6gwdx\" (UniqueName: \"kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266285 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0051c7c2-c695-478a-b746-554f8c649495-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266304 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-encryption-config\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266320 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-dir\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266338 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266347 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.266952 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc 
kubenswrapper[4958]: I0320 09:02:31.267081 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-policies\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267127 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267155 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9rg9\" (UniqueName: \"kubernetes.io/projected/b57755bc-b4cd-4b4f-b040-381c0e98b166-kube-api-access-d9rg9\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267664 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wxtz6"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267693 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b57755bc-b4cd-4b4f-b040-381c0e98b166-audit-dir\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267855 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267888 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-metrics-tls\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267965 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xww6b\" (UniqueName: \"kubernetes.io/projected/0051c7c2-c695-478a-b746-554f8c649495-kube-api-access-xww6b\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.267995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqdnm\" (UniqueName: \"kubernetes.io/projected/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-kube-api-access-mqdnm\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 
09:02:31.268017 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-serving-cert\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.268106 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.268495 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b57755bc-b4cd-4b4f-b040-381c0e98b166-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.268712 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.269065 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.270251 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.270914 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.271026 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-encryption-config\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.271087 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-serving-cert\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.271324 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.271752 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 
09:02:31.271816 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f0d46cc6-8881-4edc-b186-4388a3ced86b-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.272469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.273047 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b57755bc-b4cd-4b4f-b040-381c0e98b166-etcd-client\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.273687 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-metrics-tls\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.274478 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.275605 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.276824 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.277887 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-4nbh2"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.279018 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.279156 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.280153 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.281218 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.282591 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.284195 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.286514 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.288336 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.289879 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.290502 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.291809 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-krxrr"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.293868 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.295538 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cx5r7"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.297143 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nbh2"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.298270 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-jn96f"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.301416 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.306057 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xpvqq"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.308171 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bphsz"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.313809 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.314111 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.314151 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bphsz"] Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.323253 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.329774 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.349457 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.358077 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-config\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.369323 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.389519 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.398318 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/0051c7c2-c695-478a-b746-554f8c649495-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.409547 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.428991 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.433865 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.433872 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.433875 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.433880 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.450268 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.461858 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0051c7c2-c695-478a-b746-554f8c649495-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.489870 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.510499 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.530055 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.550058 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.570935 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.590813 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.609308 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.630877 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.650014 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.670247 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.689918 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.709371 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.729059 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.749336 4958 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.770196 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.789531 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.817411 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.830557 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.850300 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.869842 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.889406 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.910390 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.939696 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.949132 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 09:02:31 crc kubenswrapper[4958]: I0320 09:02:31.990108 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.009561 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.029721 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.049317 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.069935 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.089752 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.110453 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.130048 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 
20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.148330 4958 request.go:700] Waited for 1.002429501s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-apiserver/serviceaccounts/openshift-apiserver-sa/token Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.179316 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g66fx\" (UniqueName: \"kubernetes.io/projected/bbe16922-1799-410a-bf9f-56b3818a7e94-kube-api-access-g66fx\") pod \"apiserver-76f77b778f-lv6ph\" (UID: \"bbe16922-1799-410a-bf9f-56b3818a7e94\") " pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.191026 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.204961 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj2wq\" (UniqueName: \"kubernetes.io/projected/0abda610-306f-48a3-b854-402ed122541d-kube-api-access-vj2wq\") pod \"openshift-apiserver-operator-796bbdcf4f-rklt9\" (UID: \"0abda610-306f-48a3-b854-402ed122541d\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.210340 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.243187 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.254173 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.260402 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k7mmp\" (UniqueName: \"kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp\") pod \"controller-manager-879f6c89f-sczfm\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") " pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.266221 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhlqk\" (UniqueName: \"kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk\") pod \"oauth-openshift-558db77b4-6j2mb\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.271224 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.290652 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.291011 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zc29j\" (UniqueName: \"kubernetes.io/projected/500ed42c-e31d-40ac-90c5-3c4a4184a109-kube-api-access-zc29j\") pod \"authentication-operator-69f744f599-4jnh4\" (UID: \"500ed42c-e31d-40ac-90c5-3c4a4184a109\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.302893 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.309957 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.330271 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.337169 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.351374 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.370410 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.392143 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.411250 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.436450 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.454921 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.472991 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.491732 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.510626 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.530226 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.550394 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 09:02:32 crc 
kubenswrapper[4958]: I0320 09:02:32.569874 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.590169 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.610814 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.617199 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6j2mb"] Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.618328 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-4jnh4"] Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.629244 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.650181 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.669871 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.689362 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.709458 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.729292 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.748837 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.769449 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.789643 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.805703 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"] Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.807180 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9"] Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.810618 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.813461 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-lv6ph"] Mar 20 09:02:32 crc kubenswrapper[4958]: W0320 09:02:32.828719 4958 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0abda610_306f_48a3_b854_402ed122541d.slice/crio-7b1006eec499b09742f0e6c16f2a128d8483a505caeeffbbec92b7f6681a37f7 WatchSource:0}: Error finding container 7b1006eec499b09742f0e6c16f2a128d8483a505caeeffbbec92b7f6681a37f7: Status 404 returned error can't find the container with id 7b1006eec499b09742f0e6c16f2a128d8483a505caeeffbbec92b7f6681a37f7 Mar 20 09:02:32 crc kubenswrapper[4958]: W0320 09:02:32.829056 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbe16922_1799_410a_bf9f_56b3818a7e94.slice/crio-3d3f4f1c512235008fcb66f2559665b486d7172a9df3a5bfe40afa63872c89b7 WatchSource:0}: Error finding container 3d3f4f1c512235008fcb66f2559665b486d7172a9df3a5bfe40afa63872c89b7: Status 404 returned error can't find the container with id 3d3f4f1c512235008fcb66f2559665b486d7172a9df3a5bfe40afa63872c89b7 Mar 20 09:02:32 crc kubenswrapper[4958]: W0320 09:02:32.830086 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f2a1ac8_4fa6_424c_a37e_9d8ad771c063.slice/crio-cc965271203e564d398853d84ef80c1fcea51317b7814c981669241ac5920001 WatchSource:0}: Error finding container cc965271203e564d398853d84ef80c1fcea51317b7814c981669241ac5920001: Status 404 returned error can't find the container with id cc965271203e564d398853d84ef80c1fcea51317b7814c981669241ac5920001 Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.830257 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.850714 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.869547 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.890485 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.909751 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.929856 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.950497 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 09:02:32 crc kubenswrapper[4958]: I0320 09:02:32.969299 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.006108 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vfsr5\" (UniqueName: \"kubernetes.io/projected/f0d46cc6-8881-4edc-b186-4388a3ced86b-kube-api-access-vfsr5\") pod \"openshift-controller-manager-operator-756b6f6bc6-rfb76\" (UID: \"f0d46cc6-8881-4edc-b186-4388a3ced86b\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.023129 4958 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/4bcf0d08-8af2-46b0-9695-bd37f4bee24b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-z6ghj\" (UID: \"4bcf0d08-8af2-46b0-9695-bd37f4bee24b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.045079 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6gwdx\" (UniqueName: \"kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx\") pod \"console-f9d7485db-hrxfl\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.079550 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9rg9\" (UniqueName: \"kubernetes.io/projected/b57755bc-b4cd-4b4f-b040-381c0e98b166-kube-api-access-d9rg9\") pod \"apiserver-7bbb656c7d-6wbvm\" (UID: \"b57755bc-b4cd-4b4f-b040-381c0e98b166\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.083264 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xww6b\" (UniqueName: \"kubernetes.io/projected/0051c7c2-c695-478a-b746-554f8c649495-kube-api-access-xww6b\") pod \"kube-storage-version-migrator-operator-b67b599dd-f8fn6\" (UID: \"0051c7c2-c695-478a-b746-554f8c649495\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.106160 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqdnm\" (UniqueName: \"kubernetes.io/projected/e0046d0b-d22b-4637-96c5-c9dfe397ebe7-kube-api-access-mqdnm\") pod \"dns-operator-744455d44c-nqfn6\" (UID: \"e0046d0b-d22b-4637-96c5-c9dfe397ebe7\") " pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.109485 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.114826 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.130297 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.141763 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.154265 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.160498 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.168486 4958 request.go:700] Waited for 1.888950074s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-ingress-canary/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&limit=500&resourceVersion=0 Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.170848 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.190321 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.213315 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.232355 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.243523 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.249484 4958 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.273080 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.290338 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.310512 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.324207 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.329881 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.337573 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" event={"ID":"500ed42c-e31d-40ac-90c5-3c4a4184a109","Type":"ContainerStarted","Data":"df632fbec8d1c6780ad3486b1ca8bb52130217ae3878d81b3e302dd8093dffeb"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.337645 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" event={"ID":"500ed42c-e31d-40ac-90c5-3c4a4184a109","Type":"ContainerStarted","Data":"0e101abdae1452bc4757fb5a0e4a7e543b7b378c253bec0e0aee13f8cdebbc7d"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.340301 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" event={"ID":"0abda610-306f-48a3-b854-402ed122541d","Type":"ContainerStarted","Data":"8f25af25dc9dc0be39faf676f7744038ec2a966fd0ecbb161cf495e36a22701c"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.340356 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" event={"ID":"0abda610-306f-48a3-b854-402ed122541d","Type":"ContainerStarted","Data":"7b1006eec499b09742f0e6c16f2a128d8483a505caeeffbbec92b7f6681a37f7"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.342116 4958 generic.go:334] "Generic (PLEG): container finished" podID="bbe16922-1799-410a-bf9f-56b3818a7e94" containerID="ccbb726bbc3a7c3fa26a25dbfe2fec5ff83c07a27c563a2f02f1f75593929c90" exitCode=0 Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.342191 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" event={"ID":"bbe16922-1799-410a-bf9f-56b3818a7e94","Type":"ContainerDied","Data":"ccbb726bbc3a7c3fa26a25dbfe2fec5ff83c07a27c563a2f02f1f75593929c90"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.342214 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" event={"ID":"bbe16922-1799-410a-bf9f-56b3818a7e94","Type":"ContainerStarted","Data":"3d3f4f1c512235008fcb66f2559665b486d7172a9df3a5bfe40afa63872c89b7"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.343562 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" event={"ID":"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063","Type":"ContainerStarted","Data":"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.343633 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" event={"ID":"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063","Type":"ContainerStarted","Data":"cc965271203e564d398853d84ef80c1fcea51317b7814c981669241ac5920001"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.344543 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.346137 4958 patch_prober.go:28] interesting 
pod/controller-manager-879f6c89f-sczfm container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.346180 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.8:8443/healthz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.349433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" event={"ID":"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e","Type":"ContainerStarted","Data":"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.349479 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" event={"ID":"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e","Type":"ContainerStarted","Data":"f3bbccf610e430818a627a78b7f394ad16c2c315cb78e6cf618e29d64caaed1d"} Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.350339 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.350372 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.351683 4958 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-6j2mb container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" start-of-body= Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.351745 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.18:6443/healthz\": dial tcp 10.217.0.18:6443: connect: connection refused" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.364966 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76"] Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.373259 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.381204 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.393765 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.395018 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6"] Mar 20 09:02:33 crc kubenswrapper[4958]: W0320 09:02:33.406152 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0051c7c2_c695_478a_b746_554f8c649495.slice/crio-1f88be1e980186073d74b0450c4a5610bd779680c06d5e92eec737daee57db19 WatchSource:0}: Error finding container 1f88be1e980186073d74b0450c4a5610bd779680c06d5e92eec737daee57db19: Status 404 returned error can't find the container with id 1f88be1e980186073d74b0450c4a5610bd779680c06d5e92eec737daee57db19 Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.411076 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.433915 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj"] Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.495735 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm"] Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.501845 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-config\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.501902 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.501931 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.501957 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-machine-approver-tls\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.501981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/44e18500-3b0a-40f6-9901-064d35bb4d17-serving-cert\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502018 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502048 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-images\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502117 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57796\" (UniqueName: \"kubernetes.io/projected/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-kube-api-access-57796\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502144 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502240 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502262 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6fl59\" (UniqueName: \"kubernetes.io/projected/44e18500-3b0a-40f6-9901-064d35bb4d17-kube-api-access-6fl59\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502288 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shjjt\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502313 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502376 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-auth-proxy-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502406 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502455 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-trusted-ca\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502537 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502564 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f52d3b-36b5-4d26-a225-d8601c9c565d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502640 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f52d3b-36b5-4d26-a225-d8601c9c565d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502722 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502802 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xsbh4\" (UniqueName: \"kubernetes.io/projected/798e3302-e232-4fe3-81ed-21656b961de4-kube-api-access-xsbh4\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f52d3b-36b5-4d26-a225-d8601c9c565d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502905 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-serving-cert\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.502933 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k4zt\" (UniqueName: \"kubernetes.io/projected/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-kube-api-access-4k4zt\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.505686 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.005667134 +0000 UTC m=+174.327683092 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.505755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.505808 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rrjd\" (UniqueName: \"kubernetes.io/projected/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-kube-api-access-8rrjd\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.505836 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/798e3302-e232-4fe3-81ed-21656b961de4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.505858 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddbfg\" (UniqueName: \"kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.506205 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.506752 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-config\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.506827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.608146 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.608462 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.108424338 +0000 UTC m=+174.430440296 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609080 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwrd4\" (UniqueName: \"kubernetes.io/projected/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-kube-api-access-rwrd4\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609139 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-trusted-ca\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609178 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-config\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609226 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-client\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609248 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609297 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f52d3b-36b5-4d26-a225-d8601c9c565d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609318 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609339 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff964b7f-57fd-46ce-a640-e8db42df3acc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609378 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2xjll\" (UniqueName: \"kubernetes.io/projected/3aa3983f-0743-41e7-aefd-241e19c1d520-kube-api-access-2xjll\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609484 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609534 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-cabundle\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609561 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2ac97e8-b2ca-4c64-a495-3d415649acf3-proxy-tls\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609581 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7bwg\" (UniqueName: \"kubernetes.io/projected/030b2b44-7380-480c-a478-0d42a21a6836-kube-api-access-m7bwg\") pod \"migrator-59844c95c7-vjlkv\" (UID: \"030b2b44-7380-480c-a478-0d42a21a6836\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609636 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-apiservice-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609665 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-metrics-certs\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609681 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609745 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9slc6\" (UniqueName: \"kubernetes.io/projected/e2ac97e8-b2ca-4c64-a495-3d415649acf3-kube-api-access-9slc6\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609817 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/3d8f6396-79a0-4009-aab7-8774b4b051ab-kube-api-access-rbf6t\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609839 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xsbh4\" (UniqueName: \"kubernetes.io/projected/798e3302-e232-4fe3-81ed-21656b961de4-kube-api-access-xsbh4\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609858 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f52d3b-36b5-4d26-a225-d8601c9c565d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609896 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4k4zt\" (UniqueName: \"kubernetes.io/projected/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-kube-api-access-4k4zt\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609917 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.609979 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-serving-cert\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.610004 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddbfg\" (UniqueName: \"kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.610023 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/798e3302-e232-4fe3-81ed-21656b961de4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.610063 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.610091 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zst5v\" (UniqueName: \"kubernetes.io/projected/14a77e92-7924-4527-a6d2-1fb0ad4d9319-kube-api-access-zst5v\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.610109 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deddae37-398e-4667-9e96-f6f8f15998c7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.611527 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/29f52d3b-36b5-4d26-a225-d8601c9c565d-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612430 4958 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-webhook-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612481 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c99f62-f554-43ce-91f9-fff5b7490f6c-cert\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612507 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612585 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612647 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-registration-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612681 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rds2h\" (UniqueName: \"kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h\") pod \"auto-csr-approver-29566622-xd9xt\" (UID: \"375c7798-d728-48b0-ac0d-27ba8f57a393\") " pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612703 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-srv-cert\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.612986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d837a1b-0cc6-494a-9680-76de8c16250e-config-volume\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613016 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rtxc8\" (UniqueName: 
\"kubernetes.io/projected/d03ebcab-e060-45f2-99ea-fb25179f824c-kube-api-access-rtxc8\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613080 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-images\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613419 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9xff4\" (UniqueName: \"kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613460 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-socket-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613522 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613554 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-csi-data-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613610 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2ac97e8-b2ca-4c64-a495-3d415649acf3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613633 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gs8zm\" (UniqueName: 
\"kubernetes.io/projected/e06e31e8-6210-46ed-99e3-5a0cda45499b-kube-api-access-gs8zm\") pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613649 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2xh5\" (UniqueName: \"kubernetes.io/projected/42c99f62-f554-43ce-91f9-fff5b7490f6c-kube-api-access-b2xh5\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613668 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613686 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtssk\" (UniqueName: \"kubernetes.io/projected/deddae37-398e-4667-9e96-f6f8f15998c7-kube-api-access-jtssk\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62faea4c-22f9-43c6-9edf-76c832d63659-metrics-tls\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613730 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-auth-proxy-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613746 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613763 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-default-certificate\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613793 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2d4418-325e-4714-9106-95c4464f1b6e-config\") pod 
\"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-node-bootstrap-token\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613827 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff964b7f-57fd-46ce-a640-e8db42df3acc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613858 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mgzn5\" (UniqueName: \"kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613874 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-plugins-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613892 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-mountpoint-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613910 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d99ng\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-kube-api-access-d99ng\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613932 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613950 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-serving-cert\") pod 
\"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613965 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.613983 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-images\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.614973 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615350 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f52d3b-36b5-4d26-a225-d8601c9c565d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615378 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-serving-cert\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615408 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4jj8\" (UniqueName: \"kubernetes.io/projected/9d837a1b-0cc6-494a-9680-76de8c16250e-kube-api-access-x4jj8\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615424 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgr2k\" (UniqueName: \"kubernetes.io/projected/9fd12d2a-a471-4992-bb1f-170b0019c267-kube-api-access-qgr2k\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615459 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctxj\" (UniqueName: \"kubernetes.io/projected/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-kube-api-access-dctxj\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615476 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sxpj\" (UniqueName: \"kubernetes.io/projected/8b2d4418-325e-4714-9106-95c4464f1b6e-kube-api-access-6sxpj\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615493 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-service-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615518 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615562 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4zr9\" (UniqueName: \"kubernetes.io/projected/9d876d21-ae76-4476-ae9c-8ab29931117d-kube-api-access-n4zr9\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615692 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rrjd\" (UniqueName: \"kubernetes.io/projected/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-kube-api-access-8rrjd\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615748 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d876d21-ae76-4476-ae9c-8ab29931117d-service-ca-bundle\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.615782 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.617029 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.117009091 +0000 UTC m=+174.439025049 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.619659 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.619673 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-stats-auth\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621671 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-srv-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621700 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62faea4c-22f9-43c6-9edf-76c832d63659-trusted-ca\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621731 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dctdr\" (UniqueName: \"kubernetes.io/projected/910c6e78-aa65-4fb5-81b3-60d842e4376a-kube-api-access-dctdr\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621794 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-config\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") 
" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621819 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2d4418-325e-4714-9106-95c4464f1b6e-serving-cert\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621851 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621875 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-config\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.621949 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zgxf\" (UniqueName: \"kubernetes.io/projected/a2ac2e2b-d19a-413b-9cfc-c1a8ca008006-kube-api-access-8zgxf\") pod \"downloads-7954f5f757-xpvqq\" (UID: \"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006\") " pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.622004 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-profile-collector-cert\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623259 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-config\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623324 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44e18500-3b0a-40f6-9901-064d35bb4d17-serving-cert\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623352 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kggnb\" (UniqueName: \"kubernetes.io/projected/bcad6e4f-bc98-400e-a83f-73e553e9d926-kube-api-access-kggnb\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623385 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623401 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.623784 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-config\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624012 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-images\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624137 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-config\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624413 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-machine-approver-tls\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624587 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-certs\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624637 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624657 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e06e31e8-6210-46ed-99e3-5a0cda45499b-package-server-manager-serving-cert\") 
pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624690 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-key\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3aa3983f-0743-41e7-aefd-241e19c1d520-tmpfs\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624723 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd12d2a-a471-4992-bb1f-170b0019c267-proxy-tls\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624754 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57796\" (UniqueName: \"kubernetes.io/projected/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-kube-api-access-57796\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624772 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624810 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624836 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6fl59\" (UniqueName: \"kubernetes.io/projected/44e18500-3b0a-40f6-9901-064d35bb4d17-kube-api-access-6fl59\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624853 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shjjt\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624870 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624888 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624917 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq6g5\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-kube-api-access-hq6g5\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.624936 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d837a1b-0cc6-494a-9680-76de8c16250e-metrics-tls\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.625200 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-available-featuregates\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.625702 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-auth-proxy-config\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.627078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.627216 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-serving-cert\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.628085 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.628783 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/44e18500-3b0a-40f6-9901-064d35bb4d17-trusted-ca\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.632863 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.634818 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/798e3302-e232-4fe3-81ed-21656b961de4-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.636696 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.637480 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.642076 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-machine-approver-tls\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.642433 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/44e18500-3b0a-40f6-9901-064d35bb4d17-serving-cert\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.642625 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets\") pod 
\"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.642791 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.643054 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/29f52d3b-36b5-4d26-a225-d8601c9c565d-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.654183 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xsbh4\" (UniqueName: \"kubernetes.io/projected/798e3302-e232-4fe3-81ed-21656b961de4-kube-api-access-xsbh4\") pod \"cluster-samples-operator-665b6dd947-m9ffp\" (UID: \"798e3302-e232-4fe3-81ed-21656b961de4\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.672227 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k4zt\" (UniqueName: \"kubernetes.io/projected/a6ddb63f-7be6-4f40-8b52-a0f8cc52b149-kube-api-access-4k4zt\") pod \"openshift-config-operator-7777fb866f-fj78w\" (UID: \"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.684465 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-nqfn6"] Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.692169 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/29f52d3b-36b5-4d26-a225-d8601c9c565d-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-9fzst\" (UID: \"29f52d3b-36b5-4d26-a225-d8601c9c565d\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.712899 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddbfg\" (UniqueName: \"kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg\") pod \"route-controller-manager-6576b87f9c-lntrx\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.725887 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.726214 4958 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.226175511 +0000 UTC m=+174.548191469 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726363 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-profile-collector-cert\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726431 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kggnb\" (UniqueName: \"kubernetes.io/projected/bcad6e4f-bc98-400e-a83f-73e553e9d926-kube-api-access-kggnb\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e06e31e8-6210-46ed-99e3-5a0cda45499b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726482 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-certs\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726509 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-key\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726559 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3aa3983f-0743-41e7-aefd-241e19c1d520-tmpfs\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726578 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd12d2a-a471-4992-bb1f-170b0019c267-proxy-tls\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: 
\"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726677 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726730 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726759 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq6g5\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-kube-api-access-hq6g5\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726786 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d837a1b-0cc6-494a-9680-76de8c16250e-metrics-tls\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726815 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwrd4\" (UniqueName: \"kubernetes.io/projected/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-kube-api-access-rwrd4\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726843 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-config\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726872 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-client\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726892 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 
09:02:33.726912 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff964b7f-57fd-46ce-a640-e8db42df3acc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.726940 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2xjll\" (UniqueName: \"kubernetes.io/projected/3aa3983f-0743-41e7-aefd-241e19c1d520-kube-api-access-2xjll\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727040 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-cabundle\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727064 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7bwg\" (UniqueName: \"kubernetes.io/projected/030b2b44-7380-480c-a478-0d42a21a6836-kube-api-access-m7bwg\") pod \"migrator-59844c95c7-vjlkv\" (UID: \"030b2b44-7380-480c-a478-0d42a21a6836\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727197 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2ac97e8-b2ca-4c64-a495-3d415649acf3-proxy-tls\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727227 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-apiservice-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727260 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-metrics-certs\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727318 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727341 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/3d8f6396-79a0-4009-aab7-8774b4b051ab-kube-api-access-rbf6t\") pod 
\"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727361 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9slc6\" (UniqueName: \"kubernetes.io/projected/e2ac97e8-b2ca-4c64-a495-3d415649acf3-kube-api-access-9slc6\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727384 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727439 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zst5v\" (UniqueName: \"kubernetes.io/projected/14a77e92-7924-4527-a6d2-1fb0ad4d9319-kube-api-access-zst5v\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727459 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deddae37-398e-4667-9e96-f6f8f15998c7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727493 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-webhook-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727525 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c99f62-f554-43ce-91f9-fff5b7490f6c-cert\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727544 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc 
kubenswrapper[4958]: I0320 09:02:33.727564 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727589 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-registration-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727628 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rds2h\" (UniqueName: \"kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h\") pod \"auto-csr-approver-29566622-xd9xt\" (UID: \"375c7798-d728-48b0-ac0d-27ba8f57a393\") " pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727647 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-srv-cert\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727686 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rtxc8\" (UniqueName: \"kubernetes.io/projected/d03ebcab-e060-45f2-99ea-fb25179f824c-kube-api-access-rtxc8\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727710 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d837a1b-0cc6-494a-9680-76de8c16250e-config-volume\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727734 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9xff4\" (UniqueName: \"kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727756 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-socket-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727788 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-control-plane-machine-set-operator-tls\") pod 
\"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727820 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-csi-data-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727845 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtssk\" (UniqueName: \"kubernetes.io/projected/deddae37-398e-4667-9e96-f6f8f15998c7-kube-api-access-jtssk\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727866 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2ac97e8-b2ca-4c64-a495-3d415649acf3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727891 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gs8zm\" (UniqueName: \"kubernetes.io/projected/e06e31e8-6210-46ed-99e3-5a0cda45499b-kube-api-access-gs8zm\") pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727913 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2xh5\" (UniqueName: \"kubernetes.io/projected/42c99f62-f554-43ce-91f9-fff5b7490f6c-kube-api-access-b2xh5\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727936 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-default-certificate\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727964 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62faea4c-22f9-43c6-9edf-76c832d63659-metrics-tls\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.727972 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/3aa3983f-0743-41e7-aefd-241e19c1d520-tmpfs\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc 
kubenswrapper[4958]: I0320 09:02:33.728000 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2d4418-325e-4714-9106-95c4464f1b6e-config\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728031 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-node-bootstrap-token\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728058 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mgzn5\" (UniqueName: \"kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728082 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff964b7f-57fd-46ce-a640-e8db42df3acc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728106 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-mountpoint-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-plugins-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728153 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d99ng\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-kube-api-access-d99ng\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728196 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728219 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728242 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728266 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-serving-cert\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728290 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-images\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728314 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x4jj8\" (UniqueName: \"kubernetes.io/projected/9d837a1b-0cc6-494a-9680-76de8c16250e-kube-api-access-x4jj8\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728338 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgr2k\" (UniqueName: \"kubernetes.io/projected/9fd12d2a-a471-4992-bb1f-170b0019c267-kube-api-access-qgr2k\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728367 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctxj\" (UniqueName: \"kubernetes.io/projected/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-kube-api-access-dctxj\") pod \"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728395 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-service-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728421 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sxpj\" (UniqueName: \"kubernetes.io/projected/8b2d4418-325e-4714-9106-95c4464f1b6e-kube-api-access-6sxpj\") pod \"service-ca-operator-777779d784-fqnxh\" 
(UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728447 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728484 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4zr9\" (UniqueName: \"kubernetes.io/projected/9d876d21-ae76-4476-ae9c-8ab29931117d-kube-api-access-n4zr9\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728514 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d876d21-ae76-4476-ae9c-8ab29931117d-service-ca-bundle\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728541 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-stats-auth\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728563 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-srv-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728584 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62faea4c-22f9-43c6-9edf-76c832d63659-trusted-ca\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728628 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dctdr\" (UniqueName: \"kubernetes.io/projected/910c6e78-aa65-4fb5-81b3-60d842e4376a-kube-api-access-dctdr\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728657 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2d4418-325e-4714-9106-95c4464f1b6e-serving-cert\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728680 4958 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-config\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728709 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8zgxf\" (UniqueName: \"kubernetes.io/projected/a2ac2e2b-d19a-413b-9cfc-c1a8ca008006-kube-api-access-8zgxf\") pod \"downloads-7954f5f757-xpvqq\" (UID: \"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006\") " pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.732256 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-csi-data-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.732435 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.733730 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-socket-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.733967 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-cabundle\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.733991 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.734321 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.735411 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e2ac97e8-b2ca-4c64-a495-3d415649acf3-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.735810 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-mountpoint-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.736167 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-plugins-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.741401 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/d03ebcab-e060-45f2-99ea-fb25179f824c-registration-dir\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.728389 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-config\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.745719 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.245065428 +0000 UTC m=+174.567081386 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.750282 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-images\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.761807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/62faea4c-22f9-43c6-9edf-76c832d63659-metrics-tls\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.767631 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.769418 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8b2d4418-325e-4714-9106-95c4464f1b6e-config\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.770245 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ff964b7f-57fd-46ce-a640-e8db42df3acc-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.770746 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-profile-collector-cert\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.771111 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/910c6e78-aa65-4fb5-81b3-60d842e4376a-signing-key\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.771697 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/14a77e92-7924-4527-a6d2-1fb0ad4d9319-srv-cert\") pod 
\"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.772984 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.773876 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/9d837a1b-0cc6-494a-9680-76de8c16250e-metrics-tls\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.775742 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/e06e31e8-6210-46ed-99e3-5a0cda45499b-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.779051 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ff964b7f-57fd-46ce-a640-e8db42df3acc-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.780096 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/deddae37-398e-4667-9e96-f6f8f15998c7-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.781136 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-profile-collector-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.783053 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/9fd12d2a-a471-4992-bb1f-170b0019c267-auth-proxy-config\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.785984 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/9fd12d2a-a471-4992-bb1f-170b0019c267-proxy-tls\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.786264 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/42c99f62-f554-43ce-91f9-fff5b7490f6c-cert\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.786356 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.786935 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-apiservice-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.787184 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.787221 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/62faea4c-22f9-43c6-9edf-76c832d63659-trusted-ca\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.787234 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e2ac97e8-b2ca-4c64-a495-3d415649acf3-proxy-tls\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.788265 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d876d21-ae76-4476-ae9c-8ab29931117d-service-ca-bundle\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.788442 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-default-certificate\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.788537 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-metrics-certs\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.790884 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8b2d4418-325e-4714-9106-95c4464f1b6e-serving-cert\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.791805 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3aa3983f-0743-41e7-aefd-241e19c1d520-webhook-cert\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.791946 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-service-ca\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.792063 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/9d837a1b-0cc6-494a-9680-76de8c16250e-config-volume\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.792314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.792314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3d8f6396-79a0-4009-aab7-8774b4b051ab-config\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.793465 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-certs\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.794453 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-etcd-client\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.795114 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: 
\"kubernetes.io/secret/bcad6e4f-bc98-400e-a83f-73e553e9d926-node-bootstrap-token\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.798180 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shjjt\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.798796 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/9d876d21-ae76-4476-ae9c-8ab29931117d-stats-auth\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.799446 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.799659 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3d8f6396-79a0-4009-aab7-8774b4b051ab-serving-cert\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.801220 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rrjd\" (UniqueName: \"kubernetes.io/projected/8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5-kube-api-access-8rrjd\") pod \"machine-api-operator-5694c8668f-wxtz6\" (UID: \"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.813179 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-srv-cert\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.816139 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57796\" (UniqueName: \"kubernetes.io/projected/2ac2ac0a-47e1-4dd6-a60f-73b7afe45478-kube-api-access-57796\") pod \"machine-approver-56656f9798-x265h\" (UID: \"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.829071 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:33 crc 
kubenswrapper[4958]: E0320 09:02:33.829736 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.329718299 +0000 UTC m=+174.651734257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.845025 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-4jnh4" podStartSLOduration=101.845000417 podStartE2EDuration="1m41.845000417s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:33.843919533 +0000 UTC m=+174.165935491" watchObservedRunningTime="2026-03-20 09:02:33.845000417 +0000 UTC m=+174.167016375" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.847439 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6fl59\" (UniqueName: \"kubernetes.io/projected/44e18500-3b0a-40f6-9901-064d35bb4d17-kube-api-access-6fl59\") pod \"console-operator-58897d9998-9vnqx\" (UID: \"44e18500-3b0a-40f6-9901-064d35bb4d17\") " pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.878285 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kggnb\" (UniqueName: \"kubernetes.io/projected/bcad6e4f-bc98-400e-a83f-73e553e9d926-kube-api-access-kggnb\") pod \"machine-config-server-jn96f\" (UID: \"bcad6e4f-bc98-400e-a83f-73e553e9d926\") " pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.895477 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwrd4\" (UniqueName: \"kubernetes.io/projected/280afdbf-7bbc-4ed8-af19-a5be6f9b401b-kube-api-access-rwrd4\") pod \"olm-operator-6b444d44fb-nkzzb\" (UID: \"280afdbf-7bbc-4ed8-af19-a5be6f9b401b\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.907860 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c03d5c04-9fe4-409f-a13f-5cfd1d3910b3-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-fz8bc\" (UID: \"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.928758 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq6g5\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-kube-api-access-hq6g5\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.931476 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:33 crc kubenswrapper[4958]: E0320 09:02:33.932013 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.431987398 +0000 UTC m=+174.754003536 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.941931 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.943044 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.953905 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.959059 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-jn96f" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.959204 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.970206 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" Mar 20 09:02:33 crc kubenswrapper[4958]: I0320 09:02:33.984062 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zgxf\" (UniqueName: \"kubernetes.io/projected/a2ac2e2b-d19a-413b-9cfc-c1a8ca008006-kube-api-access-8zgxf\") pod \"downloads-7954f5f757-xpvqq\" (UID: \"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006\") " pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:33.992099 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:33.998842 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:33.999193 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rds2h\" (UniqueName: \"kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h\") pod \"auto-csr-approver-29566622-xd9xt\" (UID: \"375c7798-d728-48b0-ac0d-27ba8f57a393\") " pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.002857 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rtxc8\" (UniqueName: \"kubernetes.io/projected/d03ebcab-e060-45f2-99ea-fb25179f824c-kube-api-access-rtxc8\") pod \"csi-hostpathplugin-bphsz\" (UID: \"d03ebcab-e060-45f2-99ea-fb25179f824c\") " pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.012689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9xff4\" (UniqueName: \"kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4\") pod \"collect-profiles-29566620-nzxmw\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.031287 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7bwg\" (UniqueName: \"kubernetes.io/projected/030b2b44-7380-480c-a478-0d42a21a6836-kube-api-access-m7bwg\") pod \"migrator-59844c95c7-vjlkv\" (UID: \"030b2b44-7380-480c-a478-0d42a21a6836\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.032996 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.033299 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.533254846 +0000 UTC m=+174.855270804 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.033792 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.034239 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.534228496 +0000 UTC m=+174.856244454 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.053896 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtssk\" (UniqueName: \"kubernetes.io/projected/deddae37-398e-4667-9e96-f6f8f15998c7-kube-api-access-jtssk\") pod \"multus-admission-controller-857f4d67dd-ncgcc\" (UID: \"deddae37-398e-4667-9e96-f6f8f15998c7\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.066092 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.082066 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rbf6t\" (UniqueName: \"kubernetes.io/projected/3d8f6396-79a0-4009-aab7-8774b4b051ab-kube-api-access-rbf6t\") pod \"etcd-operator-b45778765-gksr4\" (UID: \"3d8f6396-79a0-4009-aab7-8774b4b051ab\") " pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.082484 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.086348 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst"] Mar 20 09:02:34 crc kubenswrapper[4958]: W0320 09:02:34.091709 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcad6e4f_bc98_400e_a83f_73e553e9d926.slice/crio-02bb69d095c964da815ca4b572718c9e5229162e53b6ecadb0747f12ba8ac655 WatchSource:0}: Error finding container 02bb69d095c964da815ca4b572718c9e5229162e53b6ecadb0747f12ba8ac655: Status 404 returned error can't find the container with id 02bb69d095c964da815ca4b572718c9e5229162e53b6ecadb0747f12ba8ac655 Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.096706 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9slc6\" (UniqueName: \"kubernetes.io/projected/e2ac97e8-b2ca-4c64-a495-3d415649acf3-kube-api-access-9slc6\") pod \"machine-config-controller-84d6567774-krlwj\" (UID: \"e2ac97e8-b2ca-4c64-a495-3d415649acf3\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.112110 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.122286 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2xjll\" (UniqueName: \"kubernetes.io/projected/3aa3983f-0743-41e7-aefd-241e19c1d520-kube-api-access-2xjll\") pod \"packageserver-d55dfcdfc-6d4gm\" (UID: \"3aa3983f-0743-41e7-aefd-241e19c1d520\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.130380 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" Mar 20 09:02:34 crc kubenswrapper[4958]: W0320 09:02:34.132415 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac2ac0a_47e1_4dd6_a60f_73b7afe45478.slice/crio-7ce07e1bfce2ddba7b7ee7f1939ef38c59f9c7af83b316458d531915780b54fa WatchSource:0}: Error finding container 7ce07e1bfce2ddba7b7ee7f1939ef38c59f9c7af83b316458d531915780b54fa: Status 404 returned error can't find the container with id 7ce07e1bfce2ddba7b7ee7f1939ef38c59f9c7af83b316458d531915780b54fa Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.135429 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.135674 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.635635989 +0000 UTC m=+174.957651947 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.136110 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.136788 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.636777324 +0000 UTC m=+174.958793282 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.137840 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.138174 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.162639 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zst5v\" (UniqueName: \"kubernetes.io/projected/14a77e92-7924-4527-a6d2-1fb0ad4d9319-kube-api-access-zst5v\") pod \"catalog-operator-68c6474976-55pbg\" (UID: \"14a77e92-7924-4527-a6d2-1fb0ad4d9319\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.181435 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.183138 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.200726 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.202179 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2xh5\" (UniqueName: \"kubernetes.io/projected/42c99f62-f554-43ce-91f9-fff5b7490f6c-kube-api-access-b2xh5\") pod \"ingress-canary-4nbh2\" (UID: \"42c99f62-f554-43ce-91f9-fff5b7490f6c\") " pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.202821 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gs8zm\" (UniqueName: \"kubernetes.io/projected/e06e31e8-6210-46ed-99e3-5a0cda45499b-kube-api-access-gs8zm\") pod \"package-server-manager-789f6589d5-2n2hq\" (UID: \"e06e31e8-6210-46ed-99e3-5a0cda45499b\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.206848 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.225061 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.239366 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.240044 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.740023343 +0000 UTC m=+175.062039301 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.256912 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-4nbh2" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.258391 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d99ng\" (UniqueName: \"kubernetes.io/projected/ff964b7f-57fd-46ce-a640-e8db42df3acc-kube-api-access-d99ng\") pod \"cluster-image-registry-operator-dc59b4c8b-hlv94\" (UID: \"ff964b7f-57fd-46ce-a640-e8db42df3acc\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.266721 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.292521 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgr2k\" (UniqueName: \"kubernetes.io/projected/9fd12d2a-a471-4992-bb1f-170b0019c267-kube-api-access-qgr2k\") pod \"machine-config-operator-74547568cd-sx4tp\" (UID: \"9fd12d2a-a471-4992-bb1f-170b0019c267\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.301100 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mgzn5\" (UniqueName: \"kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5\") pod \"marketplace-operator-79b997595-2gwpt\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.312368 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/62faea4c-22f9-43c6-9edf-76c832d63659-bound-sa-token\") pod \"ingress-operator-5b745b69d9-9ql8d\" (UID: \"62faea4c-22f9-43c6-9edf-76c832d63659\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.316216 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4jj8\" (UniqueName: \"kubernetes.io/projected/9d837a1b-0cc6-494a-9680-76de8c16250e-kube-api-access-x4jj8\") pod \"dns-default-krxrr\" (UID: \"9d837a1b-0cc6-494a-9680-76de8c16250e\") " pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.320539 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctxj\" (UniqueName: \"kubernetes.io/projected/ebe11c99-e14e-4390-8fd6-6638f0c6ad16-kube-api-access-dctxj\") pod \"control-plane-machine-set-operator-78cbb6b69f-v5svb\" (UID: \"ebe11c99-e14e-4390-8fd6-6638f0c6ad16\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.341608 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.342389 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.842369664 +0000 UTC m=+175.164385622 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.356472 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4zr9\" (UniqueName: \"kubernetes.io/projected/9d876d21-ae76-4476-ae9c-8ab29931117d-kube-api-access-n4zr9\") pod \"router-default-5444994796-7qnx6\" (UID: \"9d876d21-ae76-4476-ae9c-8ab29931117d\") " pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.356976 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sxpj\" (UniqueName: \"kubernetes.io/projected/8b2d4418-325e-4714-9106-95c4464f1b6e-kube-api-access-6sxpj\") pod \"service-ca-operator-777779d784-fqnxh\" (UID: \"8b2d4418-325e-4714-9106-95c4464f1b6e\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.370470 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.384737 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dctdr\" (UniqueName: \"kubernetes.io/projected/910c6e78-aa65-4fb5-81b3-60d842e4376a-kube-api-access-dctdr\") pod \"service-ca-9c57cc56f-cx5r7\" (UID: \"910c6e78-aa65-4fb5-81b3-60d842e4376a\") " pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.386879 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.406293 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.406496 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.418401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hrxfl" event={"ID":"460baf6e-b4fd-4f68-804b-86d4767241d1","Type":"ContainerStarted","Data":"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.418462 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hrxfl" event={"ID":"460baf6e-b4fd-4f68-804b-86d4767241d1","Type":"ContainerStarted","Data":"f00415f5e6083c444597746260f452c5d13d3b01e4c601e45a8f5d505dbf5164"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.419430 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.433792 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" event={"ID":"4bcf0d08-8af2-46b0-9695-bd37f4bee24b","Type":"ContainerStarted","Data":"e248726b1cfabaf9a92890b0e406af073a430f3c1d113d287a055a8327e2a2db"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.433892 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" event={"ID":"4bcf0d08-8af2-46b0-9695-bd37f4bee24b","Type":"ContainerStarted","Data":"cc2d3c421411b75004122862282ad38b896c7f4d10166369641d68ed7d827ed2"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.445089 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.445419 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:34.945403606 +0000 UTC m=+175.267419564 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.448457 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.463158 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.465774 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498459 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" event={"ID":"bbe16922-1799-410a-bf9f-56b3818a7e94","Type":"ContainerStarted","Data":"1abaefa8d66d449c10bd7082a000102e24ca847a1acf0eec6ce3b558dabc64bb"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498519 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" event={"ID":"f0d46cc6-8881-4edc-b186-4388a3ced86b","Type":"ContainerStarted","Data":"934c5319d68c73092ab86ef08fec226aaa3ef1b9336eb5d2143484cc37f5a70a"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" event={"ID":"f0d46cc6-8881-4edc-b186-4388a3ced86b","Type":"ContainerStarted","Data":"b114a9b0404fe94cfefb642d5c235ceabaec05bf6786b5b797d9ca3a5f2386b5"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498549 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jn96f" event={"ID":"bcad6e4f-bc98-400e-a83f-73e553e9d926","Type":"ContainerStarted","Data":"02bb69d095c964da815ca4b572718c9e5229162e53b6ecadb0747f12ba8ac655"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" event={"ID":"e0046d0b-d22b-4637-96c5-c9dfe397ebe7","Type":"ContainerStarted","Data":"33bee30e4bf4f94180a2a3c8ae4631521169830a3f029be9a5ae2469a63695b9"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.498774 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" event={"ID":"e0046d0b-d22b-4637-96c5-c9dfe397ebe7","Type":"ContainerStarted","Data":"d80ebf1d477aea94e4e04df49873d4b598381e904a73ed6901ba7988afed9f0f"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.519020 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.521620 4958 generic.go:334] "Generic (PLEG): container finished" podID="b57755bc-b4cd-4b4f-b040-381c0e98b166" containerID="244ecd76b69daea3ae45e740372ea63d44d9812905912433215d8360490f01e6" exitCode=0 Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.521716 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" event={"ID":"b57755bc-b4cd-4b4f-b040-381c0e98b166","Type":"ContainerDied","Data":"244ecd76b69daea3ae45e740372ea63d44d9812905912433215d8360490f01e6"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.521746 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" event={"ID":"b57755bc-b4cd-4b4f-b040-381c0e98b166","Type":"ContainerStarted","Data":"88d490fcb073f5f82005931aa547db201bdde959b1db38e4b643424b971030e4"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.546347 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.547123 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" event={"ID":"0051c7c2-c695-478a-b746-554f8c649495","Type":"ContainerStarted","Data":"bbc206d8619416371eb4482753136be222e2bc499e3ffdd1d44bce66f7c70a8e"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.547199 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" event={"ID":"0051c7c2-c695-478a-b746-554f8c649495","Type":"ContainerStarted","Data":"1f88be1e980186073d74b0450c4a5610bd779680c06d5e92eec737daee57db19"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.547542 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.550506 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.050482501 +0000 UTC m=+175.372498459 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.578943 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" event={"ID":"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478","Type":"ContainerStarted","Data":"7ce07e1bfce2ddba7b7ee7f1939ef38c59f9c7af83b316458d531915780b54fa"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.616075 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" event={"ID":"29f52d3b-36b5-4d26-a225-d8601c9c565d","Type":"ContainerStarted","Data":"46fe02bb571a81288dbf656d1fe6cde70a865c440da6e34fcb1e08bd4f3d23b3"} Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.643207 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.655376 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.663751 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" 
(UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.664965 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.164939273 +0000 UTC m=+175.486955231 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.749307 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" podStartSLOduration=102.749278764 podStartE2EDuration="1m42.749278764s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:34.744358504 +0000 UTC m=+175.066374462" watchObservedRunningTime="2026-03-20 09:02:34.749278764 +0000 UTC m=+175.071294722" Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.774128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.786143 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.286115411 +0000 UTC m=+175.608131369 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.875520 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.876004 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.375987011 +0000 UTC m=+175.698002969 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:34 crc kubenswrapper[4958]: I0320 09:02:34.978522 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:34 crc kubenswrapper[4958]: E0320 09:02:34.979059 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.479039863 +0000 UTC m=+175.801055811 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.084419 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.084916 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.584891952 +0000 UTC m=+175.906907910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.099902 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-rklt9" podStartSLOduration=103.099882191 podStartE2EDuration="1m43.099882191s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:35.056342789 +0000 UTC m=+175.378358747" watchObservedRunningTime="2026-03-20 09:02:35.099882191 +0000 UTC m=+175.421898149" Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.186502 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.187200 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.687175322 +0000 UTC m=+176.009191290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.287549 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.288209 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.788192402 +0000 UTC m=+176.110208360 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.388943 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.389258 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.889243294 +0000 UTC m=+176.211259252 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.490940 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" podStartSLOduration=103.490920145 podStartE2EDuration="1m43.490920145s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:35.481733965 +0000 UTC m=+175.803749923" watchObservedRunningTime="2026-03-20 09:02:35.490920145 +0000 UTC m=+175.812936103" Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.492012 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.492486 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:35.992453252 +0000 UTC m=+176.314469210 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.595194 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.595785 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.095746883 +0000 UTC m=+176.417762831 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.651843 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" event={"ID":"bbe16922-1799-410a-bf9f-56b3818a7e94","Type":"ContainerStarted","Data":"cf64e388ba36ce2a67a95479d4f5fc41fca156a568549db90118df4d68ebfd83"} Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.678244 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-jn96f" event={"ID":"bcad6e4f-bc98-400e-a83f-73e553e9d926","Type":"ContainerStarted","Data":"3726dac79b7aa6dfaf40d8ea58076ad626c5721c4c7f1d239d5e22de9846dd62"} Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.691639 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7qnx6" event={"ID":"9d876d21-ae76-4476-ae9c-8ab29931117d","Type":"ContainerStarted","Data":"5a55d9c5da4757f4241aeda03039907bd2c847be6d9543bf5b8ef0097276adbf"} Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.691687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-7qnx6" event={"ID":"9d876d21-ae76-4476-ae9c-8ab29931117d","Type":"ContainerStarted","Data":"ec399f30e10c96bce7f95e3d38ffd14a99faf1861a368d0cf518a5c2b98c14fa"} Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.695980 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.696460 4958 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.196442204 +0000 UTC m=+176.518458162 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.777738 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" event={"ID":"29f52d3b-36b5-4d26-a225-d8601c9c565d","Type":"ContainerStarted","Data":"590f7669e41b8d8f41999a6a104ad9439e9f1b2b163e000e45adad0b6544cb06"} Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.800691 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.804478 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.304458809 +0000 UTC m=+176.626474767 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.907749 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:35 crc kubenswrapper[4958]: E0320 09:02:35.909336 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.409312317 +0000 UTC m=+176.731328275 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:35 crc kubenswrapper[4958]: I0320 09:02:35.947086 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.012388 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.012777 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.512763302 +0000 UTC m=+176.834779260 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.038344 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" podStartSLOduration=103.038323884 podStartE2EDuration="1m43.038323884s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.035869419 +0000 UTC m=+176.357885377" watchObservedRunningTime="2026-03-20 09:02:36.038323884 +0000 UTC m=+176.360339842" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.044821 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-fj78w"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.064572 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-hrxfl" podStartSLOduration=104.064556957 podStartE2EDuration="1m44.064556957s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.064053691 +0000 UTC m=+176.386069649" watchObservedRunningTime="2026-03-20 09:02:36.064556957 +0000 UTC m=+176.386572915" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.069040 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.108631 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-9fzst" podStartSLOduration=103.108582424 podStartE2EDuration="1m43.108582424s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.10586989 +0000 UTC m=+176.427885848" watchObservedRunningTime="2026-03-20 09:02:36.108582424 +0000 UTC m=+176.430598392" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.114110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.114546 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.614524875 +0000 UTC m=+176.936540833 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.217411 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.217849 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.717835326 +0000 UTC m=+177.039851284 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: W0320 09:02:36.230465 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda6ddb63f_7be6_4f40_8b52_a0f8cc52b149.slice/crio-87b54aa9401ddd01964a9a777481f0343c007cbe06b1d154e9692c20f07e2357 WatchSource:0}: Error finding container 87b54aa9401ddd01964a9a777481f0343c007cbe06b1d154e9692c20f07e2357: Status 404 returned error can't find the container with id 87b54aa9401ddd01964a9a777481f0343c007cbe06b1d154e9692c20f07e2357 Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.242860 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-z6ghj" podStartSLOduration=103.242839421 podStartE2EDuration="1m43.242839421s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.242144491 +0000 UTC m=+176.564160449" watchObservedRunningTime="2026-03-20 09:02:36.242839421 +0000 UTC m=+176.564855379" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.243782 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-rfb76" podStartSLOduration=104.24377707 podStartE2EDuration="1m44.24377707s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.164036481 +0000 UTC m=+176.486052439" watchObservedRunningTime="2026-03-20 09:02:36.24377707 +0000 UTC m=+176.565793028" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.321841 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.322292 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.822263102 +0000 UTC m=+177.144279060 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.360989 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-f8fn6" podStartSLOduration=103.360959865 podStartE2EDuration="1m43.360959865s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.346236075 +0000 UTC m=+176.668252033" watchObservedRunningTime="2026-03-20 09:02:36.360959865 +0000 UTC m=+176.682975823" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.378961 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-jn96f" podStartSLOduration=5.378941225 podStartE2EDuration="5.378941225s" podCreationTimestamp="2026-03-20 09:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.378126281 +0000 UTC m=+176.700142229" watchObservedRunningTime="2026-03-20 09:02:36.378941225 +0000 UTC m=+176.700957183" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.432995 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.433403 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:36.933390072 +0000 UTC m=+177.255406030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.460036 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-7qnx6" podStartSLOduration=103.460018946 podStartE2EDuration="1m43.460018946s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.40946651 +0000 UTC m=+176.731482478" watchObservedRunningTime="2026-03-20 09:02:36.460018946 +0000 UTC m=+176.782034904" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.461681 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.461740 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.477978 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-7qnx6" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.541674 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.541838 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.041812939 +0000 UTC m=+177.363828897 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.542230 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.542550 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.042535231 +0000 UTC m=+177.364551189 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.643722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.644585 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.144270694 +0000 UTC m=+177.466286652 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.758342 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.758936 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.258909002 +0000 UTC m=+177.580924960 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.859670 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.859875 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.35984299 +0000 UTC m=+177.681858948 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.860008 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.860427 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.360419887 +0000 UTC m=+177.682435845 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.863155 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" event={"ID":"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478","Type":"ContainerStarted","Data":"891bd0035202b1693d773919e95d263d0e6c916e7cbab687bd2ad26f75dcadf7"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.863205 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" event={"ID":"2ac2ac0a-47e1-4dd6-a60f-73b7afe45478","Type":"ContainerStarted","Data":"f2606f7b78dbc1608fbd3092db50af5f10f2be5ed6a225c3bf44bb67eb9cea19"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.873905 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" podStartSLOduration=104.873872569 podStartE2EDuration="1m44.873872569s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.46207121 +0000 UTC m=+176.784087158" watchObservedRunningTime="2026-03-20 09:02:36.873872569 +0000 UTC m=+177.195888527" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.875714 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-wxtz6"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.876112 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" event={"ID":"280afdbf-7bbc-4ed8-af19-a5be6f9b401b","Type":"ContainerStarted","Data":"3b0b2a9ecf5d5b1bd19ca50ad41f36d193d19f0e2185824c4ec575468ac12bc9"} Mar 20 09:02:36 crc 
kubenswrapper[4958]: I0320 09:02:36.876154 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" event={"ID":"280afdbf-7bbc-4ed8-af19-a5be6f9b401b","Type":"ContainerStarted","Data":"6b1fd8a227c45a800fd4a970755f95bf0a304046dfc0b05bfcf630dab5d9482f"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.877035 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.890355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" event={"ID":"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149","Type":"ContainerStarted","Data":"71973ce459fa53b68cab661f8099f2e4a3b1b0f940cdf39b4835cf623fbb3faf"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.890506 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" event={"ID":"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149","Type":"ContainerStarted","Data":"87b54aa9401ddd01964a9a777481f0343c007cbe06b1d154e9692c20f07e2357"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.890827 4958 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-nkzzb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.890914 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" podUID="280afdbf-7bbc-4ed8-af19-a5be6f9b401b" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.900724 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.909093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" event={"ID":"798e3302-e232-4fe3-81ed-21656b961de4","Type":"ContainerStarted","Data":"3bd0c9dbf5ca38df81200e8fea7961436af801e5a6b7f8c1b6feb1dff00a8056"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.909313 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" event={"ID":"798e3302-e232-4fe3-81ed-21656b961de4","Type":"ContainerStarted","Data":"f6b2e18dc8bf598c7cb2d8b464f11c224a601210e7cc39b4f81afa6c491a7a6e"} Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.950150 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-x265h" podStartSLOduration=104.950128712 podStartE2EDuration="1m44.950128712s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.889833847 +0000 UTC m=+177.211849795" watchObservedRunningTime="2026-03-20 09:02:36.950128712 +0000 UTC m=+177.272144670" Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 
09:02:36.950660 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-9vnqx"] Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.964925 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:36 crc kubenswrapper[4958]: E0320 09:02:36.966424 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.4664057 +0000 UTC m=+177.788421658 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:36 crc kubenswrapper[4958]: I0320 09:02:36.982808 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" event={"ID":"e0046d0b-d22b-4637-96c5-c9dfe397ebe7","Type":"ContainerStarted","Data":"660ea992294293fa0338151bc9b6a8a73365058ad5aab1997d5b2f807c04a107"} Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.008909 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.010029 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.024283 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" podStartSLOduration=104.02425632 podStartE2EDuration="1m44.02425632s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:36.932156692 +0000 UTC m=+177.254172650" watchObservedRunningTime="2026-03-20 09:02:37.02425632 +0000 UTC m=+177.346272278" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.026009 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.036216 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-gksr4"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.037336 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-nqfn6" podStartSLOduration=105.03730929 podStartE2EDuration="1m45.03730929s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:37.020415372 
+0000 UTC m=+177.342431330" watchObservedRunningTime="2026-03-20 09:02:37.03730929 +0000 UTC m=+177.359325248" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.072275 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-4nbh2"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.077484 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" event={"ID":"b57755bc-b4cd-4b4f-b040-381c0e98b166","Type":"ContainerStarted","Data":"1d558efe78d02d19a061ddbb6fd8dfc264b1d34eefc6450292e73d40c845a1fd"} Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.083748 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.084361 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.584339328 +0000 UTC m=+177.906355286 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.127968 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.162670 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-xpvqq"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.170055 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.179972 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-ncgcc"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.185110 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.186806 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.686784533 +0000 UTC m=+178.008800491 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.194384 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.194450 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.199319 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.202367 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.202530 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.246572 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.246651 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.288992 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.289521 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.789506416 +0000 UTC m=+178.111522374 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.296203 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-krxrr"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.338210 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-xd9xt"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.357698 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.357793 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.359995 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh"] Mar 20 09:02:37 crc kubenswrapper[4958]: W0320 09:02:37.390007 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3aa3983f_0743_41e7_aefd_241e19c1d520.slice/crio-126c3f906e4655c9d34c40319736894746b19985530b44bdd97da3620354497d WatchSource:0}: Error finding container 126c3f906e4655c9d34c40319736894746b19985530b44bdd97da3620354497d: Status 404 returned error can't find the container with id 126c3f906e4655c9d34c40319736894746b19985530b44bdd97da3620354497d Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.391773 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.402435 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.902393 +0000 UTC m=+178.224408978 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.423176 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj"] Mar 20 09:02:37 crc kubenswrapper[4958]: W0320 09:02:37.425658 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62faea4c_22f9_43c6_9edf_76c832d63659.slice/crio-6a03863c2957e14dbf62bfa66d1409d1abd4a728a2982e886e762b6c8994308e WatchSource:0}: Error finding container 6a03863c2957e14dbf62bfa66d1409d1abd4a728a2982e886e762b6c8994308e: Status 404 returned error can't find the container with id 6a03863c2957e14dbf62bfa66d1409d1abd4a728a2982e886e762b6c8994308e Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.458086 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-bphsz"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.463839 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 09:02:37 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Mar 20 09:02:37 crc kubenswrapper[4958]: [+]process-running ok Mar 20 09:02:37 crc kubenswrapper[4958]: healthz check failed Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.463913 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.485711 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.493275 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.493715 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:37.993699173 +0000 UTC m=+178.315715131 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.504002 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-cx5r7"] Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.518364 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.593926 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.594359 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.094340882 +0000 UTC m=+178.416356830 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: W0320 09:02:37.628852 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2ac97e8_b2ca_4c64_a495_3d415649acf3.slice/crio-157f859636c2ab7ab16ab5a774e918c44be5c6483f1eb375f7dadefdffb29122 WatchSource:0}: Error finding container 157f859636c2ab7ab16ab5a774e918c44be5c6483f1eb375f7dadefdffb29122: Status 404 returned error can't find the container with id 157f859636c2ab7ab16ab5a774e918c44be5c6483f1eb375f7dadefdffb29122 Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.695227 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.695733 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.195711475 +0000 UTC m=+178.517727433 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.797186 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.797878 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.29785545 +0000 UTC m=+178.619871408 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.803170 4958 patch_prober.go:28] interesting pod/apiserver-76f77b778f-lv6ph container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]log ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]etcd ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/generic-apiserver-start-informers ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/max-in-flight-filter ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 20 09:02:37 crc kubenswrapper[4958]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 20 09:02:37 crc kubenswrapper[4958]: [-]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa failed: reason withheld Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/project.openshift.io-projectcache ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok Mar 20 09:02:37 crc kubenswrapper[4958]: [-]poststarthook/openshift.io-startinformers failed: reason withheld Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 20 09:02:37 crc kubenswrapper[4958]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 20 09:02:37 crc kubenswrapper[4958]: livez check failed Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.803248 4958 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" podUID="bbe16922-1799-410a-bf9f-56b3818a7e94" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:02:37 crc kubenswrapper[4958]: I0320 09:02:37.899929 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:37 crc kubenswrapper[4958]: E0320 09:02:37.900362 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.400345206 +0000 UTC m=+178.722361154 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.001196 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.001412 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.501363667 +0000 UTC m=+178.823379635 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.001499 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.001972 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.501955604 +0000 UTC m=+178.823971552 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.103183 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.103633 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.603531132 +0000 UTC m=+178.925547090 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.103988 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.104948 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.604939946 +0000 UTC m=+178.926955904 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.140965 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" event={"ID":"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4","Type":"ContainerStarted","Data":"6da2b3db01910ff5a949506b9f1fcd89db5d5dcbadc821a053bd820a24a7c37b"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.141029 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" event={"ID":"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4","Type":"ContainerStarted","Data":"9c72d60bc35d6f628d2db1fe380068e5dab129b6be7ed743f2d2f5bb6130d977"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.142416 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.143929 4958 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2gwpt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body= Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.144016 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.168078 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" event={"ID":"030b2b44-7380-480c-a478-0d42a21a6836","Type":"ContainerStarted","Data":"94ea997aaaa024fa9e418ee0a2bb534fa0e9191991a90313865b9b37a187f62b"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.168147 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" event={"ID":"030b2b44-7380-480c-a478-0d42a21a6836","Type":"ContainerStarted","Data":"f16bec27a69d3330e10dbd96fc6c4b8a39eac2a880c876385c64365b364e4804"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.173381 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" podStartSLOduration=105.173347169 podStartE2EDuration="1m45.173347169s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.170400579 +0000 UTC m=+178.492416547" watchObservedRunningTime="2026-03-20 09:02:38.173347169 +0000 UTC m=+178.495363127" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.182152 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" 
event={"ID":"e06e31e8-6210-46ed-99e3-5a0cda45499b","Type":"ContainerStarted","Data":"de5b8465fc2c3c0fd9c0cb4fe26650ab4ffd97d97aaa075473362591dabd2fe8"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.182198 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" event={"ID":"e06e31e8-6210-46ed-99e3-5a0cda45499b","Type":"ContainerStarted","Data":"4a20cef4a7eff5b5f524b24f3690f22f5cd724a7b93b77cc89023680ae3dcedc"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.197431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" event={"ID":"3d8f6396-79a0-4009-aab7-8774b4b051ab","Type":"ContainerStarted","Data":"16c7dc0b11b8f286fdb5b28a9b3adfaaec8866694c0fb34cd66ebf5df82e453b"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.207310 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.208012 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.707973728 +0000 UTC m=+179.029989686 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.217375 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" event={"ID":"8b2d4418-325e-4714-9106-95c4464f1b6e","Type":"ContainerStarted","Data":"543052a651782598f18dccc1be253da1ca8a6fbf8e89df5eaad0002c096da637"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.227777 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nbh2" event={"ID":"42c99f62-f554-43ce-91f9-fff5b7490f6c","Type":"ContainerStarted","Data":"36ff47ccae5fcb9ddcebf6534783300e93b8ddeb057418886c956fc0a90552cf"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.227864 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-4nbh2" event={"ID":"42c99f62-f554-43ce-91f9-fff5b7490f6c","Type":"ContainerStarted","Data":"80dd47ed346e346393bcce2a4bcaa402c522b9d7133911df9034c604bb73a28f"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.232032 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" event={"ID":"910c6e78-aa65-4fb5-81b3-60d842e4376a","Type":"ContainerStarted","Data":"aa74f4d4a196074bd9c4b8ceeec2862ca99890878821f81accbe6e66a2af550c"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.234256 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" event={"ID":"11ae2a9c-3e18-461f-8fb5-1ead8da14023","Type":"ContainerStarted","Data":"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.234281 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" event={"ID":"11ae2a9c-3e18-461f-8fb5-1ead8da14023","Type":"ContainerStarted","Data":"38d10be2e342ada33b29d6528ed6e01b87958d9e1e250cef36fca32c040e25da"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.234413 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerName="route-controller-manager" containerID="cri-o://e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42" gracePeriod=30 Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.234750 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.236074 4958 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-lntrx container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.236150 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.240111 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" event={"ID":"14a77e92-7924-4527-a6d2-1fb0ad4d9319","Type":"ContainerStarted","Data":"c32779bf506f550bd8bca860a1b40b67ba1efe7252229dbf5d5448d4d5a04dd1"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.240165 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" event={"ID":"14a77e92-7924-4527-a6d2-1fb0ad4d9319","Type":"ContainerStarted","Data":"df4eb1e6583140810c992ae39594db216c7b2a853a740da8257de49f9c4943e2"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.240571 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.241359 4958 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-55pbg container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" start-of-body= Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.241407 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" podUID="14a77e92-7924-4527-a6d2-1fb0ad4d9319" containerName="catalog-operator" 
probeResult="failure" output="Get \"https://10.217.0.34:8443/healthz\": dial tcp 10.217.0.34:8443: connect: connection refused" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.242418 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-4nbh2" podStartSLOduration=8.242405761 podStartE2EDuration="8.242405761s" podCreationTimestamp="2026-03-20 09:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.241580957 +0000 UTC m=+178.563596915" watchObservedRunningTime="2026-03-20 09:02:38.242405761 +0000 UTC m=+178.564421719" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.243664 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" event={"ID":"62faea4c-22f9-43c6-9edf-76c832d63659","Type":"ContainerStarted","Data":"6a03863c2957e14dbf62bfa66d1409d1abd4a728a2982e886e762b6c8994308e"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.244004 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.244231 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.254289 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" event={"ID":"ff964b7f-57fd-46ce-a640-e8db42df3acc","Type":"ContainerStarted","Data":"22c33e4fc8b7136d4a8c790fd2ec289ea50c46498d8d7711b40ec3ee7e72ad8e"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.262848 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg" podStartSLOduration=105.262827717 podStartE2EDuration="1m45.262827717s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.262644861 +0000 UTC m=+178.584660829" watchObservedRunningTime="2026-03-20 09:02:38.262827717 +0000 UTC m=+178.584843675" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.269156 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.269816 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" event={"ID":"5434e504-53f0-41f5-96bc-1981e69b15ac","Type":"ContainerStarted","Data":"1ab2231b2feb70b520e2d4c4a64c8e593e0b445706d2a3bcf04326ff91fc9fcd"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.276363 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" event={"ID":"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5","Type":"ContainerStarted","Data":"388155bf6f0d0adc3f94769a88cefd42ad74c488e0bd1f6f8b4587e2e0ac8034"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.276405 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" 
event={"ID":"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5","Type":"ContainerStarted","Data":"99ad3093e79fe9e37eccdcdc0cac9e3cd0c793daf103fd1a7293af7292b5d7ac"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.276417 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" event={"ID":"8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5","Type":"ContainerStarted","Data":"4dadad79f2687e2a4a117cd19e2086859631dbc92bc72b76f1a5e49aeb2deb4e"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.279145 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" event={"ID":"798e3302-e232-4fe3-81ed-21656b961de4","Type":"ContainerStarted","Data":"2e6a2d2e4a46dde42c5e89c963bac054c4354105b0ceae1c0f281918773dac79"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.281148 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" event={"ID":"375c7798-d728-48b0-ac0d-27ba8f57a393","Type":"ContainerStarted","Data":"e41a2b806757d534dbbfbefeffcde58161f29ba9d8fa9b754d3558a05536b84d"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.306360 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" event={"ID":"9fd12d2a-a471-4992-bb1f-170b0019c267","Type":"ContainerStarted","Data":"12ad9de40334e0f947895269ea6cd568268bf0c4f3bb3e495a1f997b4edd749b"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.309586 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.314313 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.814269521 +0000 UTC m=+179.136285479 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.314333 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" event={"ID":"deddae37-398e-4667-9e96-f6f8f15998c7","Type":"ContainerStarted","Data":"b41274aaf0db8fda4c01e4a672ae89869bb11619d357652c3968a3c816ca8897"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.318477 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xpvqq" event={"ID":"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006","Type":"ContainerStarted","Data":"83625142a9d866192a26f29b173b699ea60dbc3d113c373c3f02e51d12b9fcdb"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.318564 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xpvqq" event={"ID":"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006","Type":"ContainerStarted","Data":"fd5390ee5473ef20ba92e426c3427db6a8e687dab79985c255dcf428b4ed554c"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.321126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" event={"ID":"d03ebcab-e060-45f2-99ea-fb25179f824c","Type":"ContainerStarted","Data":"463e46febdc5e90378731e1577dcf4f6d4541fb8ebff657b8055d840466305a7"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.325640 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" podStartSLOduration=105.325619048 podStartE2EDuration="1m45.325619048s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.303197702 +0000 UTC m=+178.625213660" watchObservedRunningTime="2026-03-20 09:02:38.325619048 +0000 UTC m=+178.647635006" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.332098 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" event={"ID":"e2ac97e8-b2ca-4c64-a495-3d415649acf3","Type":"ContainerStarted","Data":"157f859636c2ab7ab16ab5a774e918c44be5c6483f1eb375f7dadefdffb29122"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.336653 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" event={"ID":"44e18500-3b0a-40f6-9901-064d35bb4d17","Type":"ContainerStarted","Data":"7f88fa998a0851a2c78811699eeb5023b29599a98af783fb1950173636197c4a"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.336687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" event={"ID":"44e18500-3b0a-40f6-9901-064d35bb4d17","Type":"ContainerStarted","Data":"8c119d62201c3de7f3ae807078d7b291133ce429841be4dfa9c3a825ae009963"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.337292 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-console-operator/console-operator-58897d9998-9vnqx" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.339021 4958 patch_prober.go:28] interesting pod/console-operator-58897d9998-9vnqx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" start-of-body= Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.339066 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" podUID="44e18500-3b0a-40f6-9901-064d35bb4d17" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": dial tcp 10.217.0.13:8443: connect: connection refused" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.354314 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-krxrr" event={"ID":"9d837a1b-0cc6-494a-9680-76de8c16250e","Type":"ContainerStarted","Data":"adb677631d0e0be5b8871959a8cf6cb1613e42f24420e8cda8d29a27993a30c5"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.369347 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-m9ffp" podStartSLOduration=106.369324855 podStartE2EDuration="1m46.369324855s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.332351353 +0000 UTC m=+178.654367311" watchObservedRunningTime="2026-03-20 09:02:38.369324855 +0000 UTC m=+178.691340813" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.380264 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" event={"ID":"3aa3983f-0743-41e7-aefd-241e19c1d520","Type":"ContainerStarted","Data":"126c3f906e4655c9d34c40319736894746b19985530b44bdd97da3620354497d"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.388348 4958 generic.go:334] "Generic (PLEG): container finished" podID="a6ddb63f-7be6-4f40-8b52-a0f8cc52b149" containerID="71973ce459fa53b68cab661f8099f2e4a3b1b0f940cdf39b4835cf623fbb3faf" exitCode=0 Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.388435 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" event={"ID":"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149","Type":"ContainerDied","Data":"71973ce459fa53b68cab661f8099f2e4a3b1b0f940cdf39b4835cf623fbb3faf"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.390944 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-wxtz6" podStartSLOduration=105.390928936 podStartE2EDuration="1m45.390928936s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.389863094 +0000 UTC m=+178.711879052" watchObservedRunningTime="2026-03-20 09:02:38.390928936 +0000 UTC m=+178.712944894" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.394350 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" 
event={"ID":"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3","Type":"ContainerStarted","Data":"6becbccb98a2fe25e29e1ae109a37a8f1968d7b938ab078f22e6e4f5f68fb65f"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.394441 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" event={"ID":"c03d5c04-9fe4-409f-a13f-5cfd1d3910b3","Type":"ContainerStarted","Data":"22c775bd93cb60248b443b2953228db3f4040e3e1bb090c26feda286cb1102d3"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.395956 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerName="controller-manager" containerID="cri-o://9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c" gracePeriod=30 Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.396073 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" event={"ID":"ebe11c99-e14e-4390-8fd6-6638f0c6ad16","Type":"ContainerStarted","Data":"df4e38f4b6da88726b122381ab7385912cd1da42b922d0ec1e5c0112f834e6a9"} Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.404373 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45736: no serving certificate available for the kubelet" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.407123 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-nkzzb" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.416648 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.418266 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:38.918246942 +0000 UTC m=+179.240262900 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.424816 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-6wbvm" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.446225 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-xpvqq" podStartSLOduration=106.446200057 podStartE2EDuration="1m46.446200057s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.433957022 +0000 UTC m=+178.755972980" watchObservedRunningTime="2026-03-20 09:02:38.446200057 +0000 UTC m=+178.768216015" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.463003 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 09:02:38 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Mar 20 09:02:38 crc kubenswrapper[4958]: [+]process-running ok Mar 20 09:02:38 crc kubenswrapper[4958]: healthz check failed Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.463078 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.467490 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" podStartSLOduration=106.467451727 podStartE2EDuration="1m46.467451727s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.461630909 +0000 UTC m=+178.783646867" watchObservedRunningTime="2026-03-20 09:02:38.467451727 +0000 UTC m=+178.789467705" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.490399 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45748: no serving certificate available for the kubelet" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.523494 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.528840 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 09:02:39.028821945 +0000 UTC m=+179.350837903 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.561203 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-fz8bc" podStartSLOduration=105.561159594 podStartE2EDuration="1m45.561159594s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:38.559535745 +0000 UTC m=+178.881551703" watchObservedRunningTime="2026-03-20 09:02:38.561159594 +0000 UTC m=+178.883175552" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.597141 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45754: no serving certificate available for the kubelet" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.624836 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.627763 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.127732261 +0000 UTC m=+179.449748219 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.693150 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45762: no serving certificate available for the kubelet" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.730232 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.730662 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 09:02:39.23064847 +0000 UTC m=+179.552664428 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.805395 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45772: no serving certificate available for the kubelet" Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.831159 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.831525 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.331492855 +0000 UTC m=+179.653508803 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.832096 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.832562 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.332542878 +0000 UTC m=+179.654558836 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.934471 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.934803 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.434761635 +0000 UTC m=+179.756777593 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.935114 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:38 crc kubenswrapper[4958]: E0320 09:02:38.935646 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.435626561 +0000 UTC m=+179.757642519 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:38 crc kubenswrapper[4958]: I0320 09:02:38.950749 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45782: no serving certificate available for the kubelet" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.027557 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-lntrx_11ae2a9c-3e18-461f-8fb5-1ead8da14023/route-controller-manager/0.log" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.028709 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.037397 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.038063 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.538012764 +0000 UTC m=+179.860028722 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.038140 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.039207 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.53919894 +0000 UTC m=+179.861214898 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.082313 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"] Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.082673 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerName="route-controller-manager" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.082697 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerName="route-controller-manager" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.082852 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerName="route-controller-manager" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.083360 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.101001 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"] Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.139545 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config\") pod \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.139643 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca\") pod \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.139742 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert\") pod \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.139775 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddbfg\" (UniqueName: \"kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg\") pod \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\" (UID: \"11ae2a9c-3e18-461f-8fb5-1ead8da14023\") " Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.139943 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 
09:02:39.140312 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.640294423 +0000 UTC m=+179.962310381 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.144740 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config" (OuterVolumeSpecName: "config") pod "11ae2a9c-3e18-461f-8fb5-1ead8da14023" (UID: "11ae2a9c-3e18-461f-8fb5-1ead8da14023"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.164529 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "11ae2a9c-3e18-461f-8fb5-1ead8da14023" (UID: "11ae2a9c-3e18-461f-8fb5-1ead8da14023"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.167949 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca" (OuterVolumeSpecName: "client-ca") pod "11ae2a9c-3e18-461f-8fb5-1ead8da14023" (UID: "11ae2a9c-3e18-461f-8fb5-1ead8da14023"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.202641 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg" (OuterVolumeSpecName: "kube-api-access-ddbfg") pod "11ae2a9c-3e18-461f-8fb5-1ead8da14023" (UID: "11ae2a9c-3e18-461f-8fb5-1ead8da14023"). InnerVolumeSpecName "kube-api-access-ddbfg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.204987 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45784: no serving certificate available for the kubelet" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.247148 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.259285 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.259463 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.259581 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.260073 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq62x\" (UniqueName: \"kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.260232 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.260244 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/11ae2a9c-3e18-461f-8fb5-1ead8da14023-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.260255 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddbfg\" (UniqueName: \"kubernetes.io/projected/11ae2a9c-3e18-461f-8fb5-1ead8da14023-kube-api-access-ddbfg\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.260272 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/11ae2a9c-3e18-461f-8fb5-1ead8da14023-config\") on node \"crc\" DevicePath 
\"\"" Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.260712 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.760690847 +0000 UTC m=+180.082706795 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.365167 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.365481 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.365553 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.865495843 +0000 UTC m=+180.187511811 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.365648 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vq62x\" (UniqueName: \"kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.365908 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.365991 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.366108 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.367785 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.368311 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.868288139 +0000 UTC m=+180.190304307 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.371787 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.387025 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.391357 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.406507 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq62x\" (UniqueName: \"kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x\") pod \"route-controller-manager-6b7cb48b5d-jsl66\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.423824 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" event={"ID":"ff964b7f-57fd-46ce-a640-e8db42df3acc","Type":"ContainerStarted","Data":"e842880efa80c45055538b385150a27db7903cb48b14bc1d3984fc3dcae7d2c5"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.431630 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" event={"ID":"9fd12d2a-a471-4992-bb1f-170b0019c267","Type":"ContainerStarted","Data":"b39dbe9ff6f32b0e6468f9e1d4616a57554e99449eef1bf361f8e0bd27ca2dca"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.431696 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" event={"ID":"9fd12d2a-a471-4992-bb1f-170b0019c267","Type":"ContainerStarted","Data":"2b50eb01a0a2e092cf1f03595aeea05ad8730586942ba94499e51568d9d8d18c"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.452036 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.464907 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 09:02:39 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Mar 20 09:02:39 crc kubenswrapper[4958]: [+]process-running ok
Mar 20 09:02:39 crc kubenswrapper[4958]: healthz check failed
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.464991 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.467121 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-hlv94" podStartSLOduration=107.467096292 podStartE2EDuration="1m47.467096292s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.461397838 +0000 UTC m=+179.783413796" watchObservedRunningTime="2026-03-20 09:02:39.467096292 +0000 UTC m=+179.789112250"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.468334 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.468983 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:39.968961239 +0000 UTC m=+180.290977197 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.486639 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" event={"ID":"deddae37-398e-4667-9e96-f6f8f15998c7","Type":"ContainerStarted","Data":"2f10f89539595c7c8b02b1d29c8a3a8de00b9160aee887766c6d62395d81993c"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.487136 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" event={"ID":"deddae37-398e-4667-9e96-f6f8f15998c7","Type":"ContainerStarted","Data":"1ff8686f4bc8b220c99e61f216f8992cab12355cf2afaf46d1764c1a1d40c70f"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.512448 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" event={"ID":"e06e31e8-6210-46ed-99e3-5a0cda45499b","Type":"ContainerStarted","Data":"338165260d91bec325e14c74343e15089b280e4dc9a92acff599c96c28a4b9d5"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.513391 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.519661 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-sx4tp" podStartSLOduration=106.51964649 podStartE2EDuration="1m46.51964649s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.491502429 +0000 UTC m=+179.813518397" watchObservedRunningTime="2026-03-20 09:02:39.51964649 +0000 UTC m=+179.841662438"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.528840 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" event={"ID":"8b2d4418-325e-4714-9106-95c4464f1b6e","Type":"ContainerStarted","Data":"c8bec785b3b69856e0e99f955aa0a8c21f81dcbae0eece9c263cfdfaff20afba"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.552883 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-ncgcc" podStartSLOduration=106.552859626 podStartE2EDuration="1m46.552859626s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.522999103 +0000 UTC m=+179.845015061" watchObservedRunningTime="2026-03-20 09:02:39.552859626 +0000 UTC m=+179.874875584"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.559113 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" event={"ID":"030b2b44-7380-480c-a478-0d42a21a6836","Type":"ContainerStarted","Data":"1d0c72489554bd2aa141927bb08b2debc5ed963f6c477e318efbce4149ad43a3"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.565948 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" event={"ID":"5434e504-53f0-41f5-96bc-1981e69b15ac","Type":"ContainerStarted","Data":"5fe83ebb49b2b9ed133cdce65b5dd206dba5038eb5a663ef3adecb3ba8944ddd"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.569572 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" event={"ID":"3d8f6396-79a0-4009-aab7-8774b4b051ab","Type":"ContainerStarted","Data":"94739ae968edf68d94ba749659173bd96c0e3af696f1f257f4c65e6fae2bba6c"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.570408 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert\") pod \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.570518 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca\") pod \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.570700 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles\") pod \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.570766 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k7mmp\" (UniqueName: \"kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp\") pod \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.570792 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config\") pod \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\" (UID: \"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.571365 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.573517 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca" (OuterVolumeSpecName: "client-ca") pod "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" (UID: "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.574060 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" (UID: "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.574934 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.074914161 +0000 UTC m=+180.396930319 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.575328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config" (OuterVolumeSpecName: "config") pod "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" (UID: "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.584150 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" event={"ID":"910c6e78-aa65-4fb5-81b3-60d842e4376a","Type":"ContainerStarted","Data":"0d4194a6903b1cf4d1ed2b5aa9642d5bcdf00eea73101748c6773f4dee722ccc"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.589532 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" (UID: "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.601243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" event={"ID":"e2ac97e8-b2ca-4c64-a495-3d415649acf3","Type":"ContainerStarted","Data":"d7a87f66be944a32239a606d29517b4d5025fe3a3883271a8c318bfd85aadd24"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.601315 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp" (OuterVolumeSpecName: "kube-api-access-k7mmp") pod "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" (UID: "9f2a1ac8-4fa6-424c-a37e-9d8ad771c063"). InnerVolumeSpecName "kube-api-access-k7mmp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.601690 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45794: no serving certificate available for the kubelet"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.602848 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" podStartSLOduration=106.602814945 podStartE2EDuration="1m46.602814945s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.549756641 +0000 UTC m=+179.871772599" watchObservedRunningTime="2026-03-20 09:02:39.602814945 +0000 UTC m=+179.924830893"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.603036 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-fqnxh" podStartSLOduration=106.603027301 podStartE2EDuration="1m46.603027301s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.585384601 +0000 UTC m=+179.907400559" watchObservedRunningTime="2026-03-20 09:02:39.603027301 +0000 UTC m=+179.925043279"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.612234 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-krxrr" event={"ID":"9d837a1b-0cc6-494a-9680-76de8c16250e","Type":"ContainerStarted","Data":"2edf3da1af349ccad58a74f8997cb15f80b154961ed60ef106a85edec31f655c"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.644550 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-gksr4" podStartSLOduration=107.644527692 podStartE2EDuration="1m47.644527692s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.643438698 +0000 UTC m=+179.965454656" watchObservedRunningTime="2026-03-20 09:02:39.644527692 +0000 UTC m=+179.966543650"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.670903 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-6576b87f9c-lntrx_11ae2a9c-3e18-461f-8fb5-1ead8da14023/route-controller-manager/0.log"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.670972 4958 generic.go:334] "Generic (PLEG): container finished" podID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" containerID="e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42" exitCode=2
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.671096 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" event={"ID":"11ae2a9c-3e18-461f-8fb5-1ead8da14023","Type":"ContainerDied","Data":"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.671140 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx" event={"ID":"11ae2a9c-3e18-461f-8fb5-1ead8da14023","Type":"ContainerDied","Data":"38d10be2e342ada33b29d6528ed6e01b87958d9e1e250cef36fca32c040e25da"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.671165 4958 scope.go:117] "RemoveContainer" containerID="e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.671409 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.677716 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.678287 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.678313 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k7mmp\" (UniqueName: \"kubernetes.io/projected/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-kube-api-access-k7mmp\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.678328 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-config\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.678341 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-serving-cert\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.678352 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063-client-ca\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.679591 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.179569684 +0000 UTC m=+180.501585642 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.696323 4958 generic.go:334] "Generic (PLEG): container finished" podID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerID="9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c" exitCode=0
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.696650 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.696672 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" event={"ID":"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063","Type":"ContainerDied","Data":"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.701480 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-sczfm" event={"ID":"9f2a1ac8-4fa6-424c-a37e-9d8ad771c063","Type":"ContainerDied","Data":"cc965271203e564d398853d84ef80c1fcea51317b7814c981669241ac5920001"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.723555 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" event={"ID":"ebe11c99-e14e-4390-8fd6-6638f0c6ad16","Type":"ContainerStarted","Data":"dad5de492e4b3b0623f7a1f47cc08905278e9c40f7c4b2626de2008af1535b26"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.745966 4958 scope.go:117] "RemoveContainer" containerID="e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.746970 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" podStartSLOduration=107.746947615 podStartE2EDuration="1m47.746947615s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.74614364 +0000 UTC m=+180.068159598" watchObservedRunningTime="2026-03-20 09:02:39.746947615 +0000 UTC m=+180.068963573"
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.755323 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42\": container with ID starting with e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42 not found: ID does not exist" containerID="e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.755403 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42"} err="failed to get container status \"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42\": rpc error: code = NotFound desc = could not find container \"e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42\": container with ID starting with e2f0f8e568ec60ffb16baf43617fca0e4bff2c2a9d33f5758539bd2d4f5c8f42 not found: ID does not exist"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.755460 4958 scope.go:117] "RemoveContainer" containerID="9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.766035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" event={"ID":"62faea4c-22f9-43c6-9edf-76c832d63659","Type":"ContainerStarted","Data":"0449dcc5d75020b6947a271e0f1e7445c3552953b2795f7be270406c9c496de5"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.766965 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-vjlkv" podStartSLOduration=106.766936376 podStartE2EDuration="1m46.766936376s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.688690092 +0000 UTC m=+180.010706050" watchObservedRunningTime="2026-03-20 09:02:39.766936376 +0000 UTC m=+180.088952334"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.790476 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.791716 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.291702415 +0000 UTC m=+180.613718373 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.794696 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" event={"ID":"3aa3983f-0743-41e7-aefd-241e19c1d520","Type":"ContainerStarted","Data":"b76e3f215171fdd6a698e41396b7a46c447f62c01df362cd6d9bce512bbd02ee"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.796080 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.799216 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-cx5r7" podStartSLOduration=106.799183333 podStartE2EDuration="1m46.799183333s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.784571386 +0000 UTC m=+180.106587344" watchObservedRunningTime="2026-03-20 09:02:39.799183333 +0000 UTC m=+180.121199291"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.820291 4958 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-6d4gm container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused" start-of-body=
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.820370 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" podUID="3aa3983f-0743-41e7-aefd-241e19c1d520" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.23:5443/healthz\": dial tcp 10.217.0.23:5443: connect: connection refused"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.834895 4958 scope.go:117] "RemoveContainer" containerID="9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.837210 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" event={"ID":"a6ddb63f-7be6-4f40-8b52-a0f8cc52b149","Type":"ContainerStarted","Data":"2d03547daf8bb3a24cbcf986c12a6515305d25116617a9b7d49128fea875d4f9"}
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.837265 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.839154 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xpvqq"
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.842362 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c\": container with ID starting with 9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c not found: ID does not exist" containerID="9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.842438 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c"} err="failed to get container status \"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c\": rpc error: code = NotFound desc = could not find container \"9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c\": container with ID starting with 9826639fa4218c80ed77aa7af359f0768f33e5d729d80170a03ec2fa1263268c not found: ID does not exist"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.842893 4958 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2gwpt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.842954 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.843333 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.843418 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.850162 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-v5svb" podStartSLOduration=106.850140362 podStartE2EDuration="1m46.850140362s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.821127955 +0000 UTC m=+180.143143913" watchObservedRunningTime="2026-03-20 09:02:39.850140362 +0000 UTC m=+180.172156320"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.851127 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"]
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.857072 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-lntrx"]
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.868924 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-55pbg"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.887170 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"]
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.891679 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.893069 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.393044535 +0000 UTC m=+180.715060493 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.925726 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-sczfm"]
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.958695 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w" podStartSLOduration=107.958662443 podStartE2EDuration="1m47.958662443s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.928506829 +0000 UTC m=+180.250522787" watchObservedRunningTime="2026-03-20 09:02:39.958662443 +0000 UTC m=+180.280678401"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.963981 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" podStartSLOduration=106.963956505 podStartE2EDuration="1m46.963956505s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.962341616 +0000 UTC m=+180.284357574" watchObservedRunningTime="2026-03-20 09:02:39.963956505 +0000 UTC m=+180.285972463"
Mar 20 09:02:39 crc kubenswrapper[4958]: I0320 09:02:39.994243 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:39 crc kubenswrapper[4958]: E0320 09:02:39.997405 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.497382368 +0000 UTC m=+180.819398326 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.002017 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" podStartSLOduration=108.001997189 podStartE2EDuration="1m48.001997189s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:39.989984211 +0000 UTC m=+180.312000169" watchObservedRunningTime="2026-03-20 09:02:40.001997189 +0000 UTC m=+180.324013147"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.015529 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"]
Mar 20 09:02:40 crc kubenswrapper[4958]: W0320 09:02:40.054365 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac2a4f4a_c38a_46c9_b757_9437d861719f.slice/crio-24dd2d2d063e7d3a37d5d24f3f6c1df39f0470202224c6ac6e320b95ae24517b WatchSource:0}: Error finding container 24dd2d2d063e7d3a37d5d24f3f6c1df39f0470202224c6ac6e320b95ae24517b: Status 404 returned error can't find the container with id 24dd2d2d063e7d3a37d5d24f3f6c1df39f0470202224c6ac6e320b95ae24517b
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.096285 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.096644 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.596627584 +0000 UTC m=+180.918643542 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.199104 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.199545 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.699519232 +0000 UTC m=+181.021535190 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.301179 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.301407 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.801366098 +0000 UTC m=+181.123382056 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.301736 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.302167 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.802154752 +0000 UTC m=+181.124170900 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.350102 4958 ???:1] "http: TLS handshake error from 192.168.126.11:45810: no serving certificate available for the kubelet"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.402935 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.403199 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.903157452 +0000 UTC m=+181.225173410 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.403359 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.403808 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:40.903793882 +0000 UTC m=+181.225809840 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.465301 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 09:02:40 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Mar 20 09:02:40 crc kubenswrapper[4958]: [+]process-running ok
Mar 20 09:02:40 crc kubenswrapper[4958]: healthz check failed
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.465367 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.496152 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11ae2a9c-3e18-461f-8fb5-1ead8da14023" path="/var/lib/kubelet/pods/11ae2a9c-3e18-461f-8fb5-1ead8da14023/volumes"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.496806 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" path="/var/lib/kubelet/pods/9f2a1ac8-4fa6-424c-a37e-9d8ad771c063/volumes"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.507134 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.507775 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.007757223 +0000 UTC m=+181.329773181 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.609026 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.609407 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.109392733 +0000 UTC m=+181.431408691 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.713283 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.713788 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.213768136 +0000 UTC m=+181.535784094 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.716104 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.719125 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerName="controller-manager"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.719155 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerName="controller-manager"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.720466 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9f2a1ac8-4fa6-424c-a37e-9d8ad771c063" containerName="controller-manager"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.734831 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.742280 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.747092 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.763330 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"]
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.820254 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.820365 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.820417 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.820874 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.320848723 +0000 UTC m=+181.642864681 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.841735 4958 patch_prober.go:28] interesting pod/console-operator-58897d9998-9vnqx container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body=
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.841818 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-9vnqx" podUID="44e18500-3b0a-40f6-9901-064d35bb4d17" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.13:8443/readyz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.865882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" event={"ID":"d03ebcab-e060-45f2-99ea-fb25179f824c","Type":"ContainerStarted","Data":"10e9199dd146ea52156d6d76e91acf6af7a6d0cf7db6ad5a3accdacccd3ccca3"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.901034 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-9ql8d" event={"ID":"62faea4c-22f9-43c6-9edf-76c832d63659","Type":"ContainerStarted","Data":"0f1981616a5888c00710c75b9cc61f165c3cc4153de787d6d2eb8d7182a1cf8a"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.916541 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" event={"ID":"ac2a4f4a-c38a-46c9-b757-9437d861719f","Type":"ContainerStarted","Data":"192abfbb386834965586639ea3cc95ebb9a873bbe0e041470eb66ff21b373454"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.916617 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" event={"ID":"ac2a4f4a-c38a-46c9-b757-9437d861719f","Type":"ContainerStarted","Data":"24dd2d2d063e7d3a37d5d24f3f6c1df39f0470202224c6ac6e320b95ae24517b"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.917048 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.918372 4958 patch_prober.go:28] interesting pod/route-controller-manager-6b7cb48b5d-jsl66 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body=
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.918423 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.921562 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.921935 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.922042 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.922186 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:40 crc kubenswrapper[4958]: E0320 09:02:40.922292 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.422267325 +0000 UTC m=+181.744283283 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.924874 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" event={"ID":"e2ac97e8-b2ca-4c64-a495-3d415649acf3","Type":"ContainerStarted","Data":"55e2c10ba5e78d1cf456a64dca254931bcb7f9783c99f2228cab61715d319e4a"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.946157 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-krxrr" event={"ID":"9d837a1b-0cc6-494a-9680-76de8c16250e","Type":"ContainerStarted","Data":"2ab14e8e2668d157aee993f21322f0ed1ace8cf50500e3f7338479fed83ba493"}
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.946218 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-krxrr"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.950854 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podStartSLOduration=3.950823219 podStartE2EDuration="3.950823219s" podCreationTimestamp="2026-03-20 09:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:40.948415286 +0000 UTC m=+181.270431244" watchObservedRunningTime="2026-03-20 09:02:40.950823219 +0000 UTC m=+181.272839177"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.951845 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.951997 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.952423 4958 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-2gwpt container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused" start-of-body=
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.952452 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.19:8080/healthz\": dial tcp 10.217.0.19:8080: connect: connection refused"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.974403 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-krxrr" podStartSLOduration=10.974377759 podStartE2EDuration="10.974377759s" podCreationTimestamp="2026-03-20 09:02:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:40.97405532 +0000 UTC m=+181.296071268" watchObservedRunningTime="2026-03-20 09:02:40.974377759 +0000 UTC m=+181.296393717"
Mar 20 09:02:40 crc kubenswrapper[4958]: I0320 09:02:40.990761 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") " pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.004530 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krlwj" podStartSLOduration=108.004505992 podStartE2EDuration="1m48.004505992s" podCreationTimestamp="2026-03-20 09:00:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:41.000092907 +0000 UTC m=+181.322108875" watchObservedRunningTime="2026-03-20 09:02:41.004505992 +0000 UTC m=+181.326521950"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.024357 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.029366 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.529342642 +0000 UTC m=+181.851358780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.083888 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.132284 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.133341 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.633316503 +0000 UTC m=+181.955332461 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.149429 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-549hv"]
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.155525 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-549hv"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.163425 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.185254 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-549hv"]
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.235997 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.236048 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.236117 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bzbd9\" (UniqueName: \"kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv"
Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.236166 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.236626 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.736585183 +0000 UTC m=+182.058601141 (durationBeforeRetry 500ms).
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.339502 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.339741 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.839705967 +0000 UTC m=+182.161721925 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.340405 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bzbd9\" (UniqueName: \"kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.340480 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.340561 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.340592 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.341038 4958 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.841029418 +0000 UTC m=+182.163045376 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.341190 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.341535 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.380511 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bzbd9\" (UniqueName: \"kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9\") pod \"certified-operators-549hv\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.442329 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.443185 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:41.943103381 +0000 UTC m=+182.265119339 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.463675 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 09:02:41 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Mar 20 09:02:41 crc kubenswrapper[4958]: [+]process-running ok Mar 20 09:02:41 crc kubenswrapper[4958]: healthz check failed Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.463744 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.533130 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.540370 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.544042 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-875rt"] Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.544806 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.545274 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.045254916 +0000 UTC m=+182.367270874 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.552566 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.570044 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-875rt"] Mar 20 09:02:41 crc kubenswrapper[4958]: W0320 09:02:41.635773 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8baa5a0b_ecd3_4a29_819f_699c33ae89e6.slice/crio-2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd WatchSource:0}: Error finding container 2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd: Status 404 returned error can't find the container with id 2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.646031 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.646441 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.646552 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjdx2\" (UniqueName: \"kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.646688 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.646867 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.146847325 +0000 UTC m=+182.468863293 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.715855 4958 ???:1] "http: TLS handshake error from 192.168.126.11:37512: no serving certificate available for the kubelet" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.734908 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-mpjsp"] Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.747126 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.749480 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjdx2\" (UniqueName: \"kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.749525 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.749591 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.749673 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.750182 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.750830 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.250814636 +0000 UTC m=+182.572830594 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.754646 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.755010 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.788451 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpjsp"] Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.836831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjdx2\" (UniqueName: \"kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2\") pod \"certified-operators-875rt\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.850729 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.851154 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlt8l\" (UniqueName: \"kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.851286 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.851311 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.851451 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 09:02:42.351430375 +0000 UTC m=+182.673446333 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.879832 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-6d4gm" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.917720 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.950052 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-m4c8h"] Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.951621 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.953353 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.953405 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.953432 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.953479 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlt8l\" (UniqueName: \"kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: E0320 09:02:41.954225 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.454209279 +0000 UTC m=+182.776225237 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.954796 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.955079 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:41 crc kubenswrapper[4958]: I0320 09:02:41.985152 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4c8h"] Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.018282 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlt8l\" (UniqueName: \"kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l\") pod \"community-operators-mpjsp\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.021189 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8baa5a0b-ecd3-4a29-819f-699c33ae89e6","Type":"ContainerStarted","Data":"2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd"} Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.027654 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.065174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.066269 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.066552 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" 
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.066586 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sw8xr\" (UniqueName: \"kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.066745 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.566724302 +0000 UTC m=+182.888740260 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.077982 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"] Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.079101 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.082049 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.087545 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.087846 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.088069 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.088548 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.088745 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.089341 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.102026 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.111854 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"] Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.154324 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-549hv"] Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171763 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171835 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171902 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171925 4958 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-sw8xr\" (UniqueName: \"kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.171980 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8cdzx\" (UniqueName: \"kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.172087 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.172119 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.172177 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.172305 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.672277741 +0000 UTC m=+182.994293699 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.173746 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.180856 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: W0320 09:02:42.190807 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfcb5229f_2b8f_4e6a_8542_cd03b84e9737.slice/crio-cad3dbe1843341eaf6a0fdc589d7828ecba0489d05ce6107941e638cf6856f4b WatchSource:0}: Error finding container cad3dbe1843341eaf6a0fdc589d7828ecba0489d05ce6107941e638cf6856f4b: Status 404 returned error can't find the container with id cad3dbe1843341eaf6a0fdc589d7828ecba0489d05ce6107941e638cf6856f4b Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.204218 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sw8xr\" (UniqueName: \"kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr\") pod \"community-operators-m4c8h\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.248693 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.256521 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-lv6ph" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.275471 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.275957 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.276004 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8cdzx\" (UniqueName: 
\"kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.276059 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.276102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.276180 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.278435 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.778408868 +0000 UTC m=+183.100424826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.279346 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.279831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.281757 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.284740 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.327651 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.327734 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8cdzx\" (UniqueName: \"kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx\") pod \"controller-manager-79fcbf85b8-bk5cz\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.393819 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.394363 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.894341116 +0000 UTC m=+183.216357074 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.451627 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.458182 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 09:02:42 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld Mar 20 09:02:42 crc kubenswrapper[4958]: [+]process-running ok Mar 20 09:02:42 crc kubenswrapper[4958]: healthz check failed Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.458260 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.494896 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.495506 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:42.99548547 +0000 UTC m=+183.317501428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.557328 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-875rt"]
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.604299 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.606695 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.106675332 +0000 UTC m=+183.428691290 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.680975 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-mpjsp"]
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.717150 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.717585 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.217563564 +0000 UTC m=+183.539579522 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.823507 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.824115 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.324095634 +0000 UTC m=+183.646111592 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.896523 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-m4c8h"]
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.930408 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.930561 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.430533971 +0000 UTC m=+183.752549929 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.930763 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:42 crc kubenswrapper[4958]: E0320 09:02:42.931137 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.431119709 +0000 UTC m=+183.753135667 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.960652 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-fj78w"
Mar 20 09:02:42 crc kubenswrapper[4958]: I0320 09:02:42.972164 4958 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.000986 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"]
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.032427 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.032760 4958 generic.go:334] "Generic (PLEG): container finished" podID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerID="1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507" exitCode=0
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.032851 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerDied","Data":"1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.032889 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerStarted","Data":"7b2af21a1ca020f4b44d22b2c9f0f100725c2c7cb29c7dd014e23f8d8582dc6f"}
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.033154 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.533135 +0000 UTC m=+183.855150948 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.037918 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8baa5a0b-ecd3-4a29-819f-699c33ae89e6","Type":"ContainerStarted","Data":"b130bcae9a04f0215a07dfb9c2851d87b765fee7fac8616504b14f03b54c8021"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.038266 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerStarted","Data":"a7b461d3196a9ec1b2875f1bff180e2af981110343219c7dc7214bcbe903a613"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.038306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerStarted","Data":"844dac9951fadca61dc09bb6fa55c3e3620fcc77cab2e921303f4e8b8f330cc1"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.046459 4958 generic.go:334] "Generic (PLEG): container finished" podID="5434e504-53f0-41f5-96bc-1981e69b15ac" containerID="5fe83ebb49b2b9ed133cdce65b5dd206dba5038eb5a663ef3adecb3ba8944ddd" exitCode=0
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.046563 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" event={"ID":"5434e504-53f0-41f5-96bc-1981e69b15ac","Type":"ContainerDied","Data":"5fe83ebb49b2b9ed133cdce65b5dd206dba5038eb5a663ef3adecb3ba8944ddd"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.062922 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" event={"ID":"d03ebcab-e060-45f2-99ea-fb25179f824c","Type":"ContainerStarted","Data":"a6bbd1816b848d96043db8d52c10bccaee33eb8e08e3b46172bee3b6a90d110d"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.067581 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerStarted","Data":"31163c1a295b3b71a757b3f8ff3f62466d67e21e65a54f429130de2f7529fea1"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.072252 4958 generic.go:334] "Generic (PLEG): container finished" podID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerID="8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f" exitCode=0
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.073349 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerDied","Data":"8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.073398 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerStarted","Data":"cad3dbe1843341eaf6a0fdc589d7828ecba0489d05ce6107941e638cf6856f4b"}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.126105 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=3.126074484 podStartE2EDuration="3.126074484s" podCreationTimestamp="2026-03-20 09:02:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:43.110069584 +0000 UTC m=+183.432085572" watchObservedRunningTime="2026-03-20 09:02:43.126074484 +0000 UTC m=+183.448090442"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.137891 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.138388 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.63837406 +0000 UTC m=+183.960390018 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.239902 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.241097 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.741078262 +0000 UTC m=+184.063094220 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.330855 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-hrxfl"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.330935 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-hrxfl"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.332806 4958 patch_prober.go:28] interesting pod/console-f9d7485db-hrxfl container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused" start-of-body=
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.332885 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-hrxfl" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerName="console" probeResult="failure" output="Get \"https://10.217.0.9:8443/health\": dial tcp 10.217.0.9:8443: connect: connection refused"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.343690 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.343765 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.344218 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.844203448 +0000 UTC m=+184.166219406 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.347658 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"]
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.348963 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"]
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.348995 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.394088 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.394427 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.410202 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/14288bf2-b6fe-4961-ad00-a39f76ff1a78-metrics-certs\") pod \"network-metrics-daemon-trr7n\" (UID: \"14288bf2-b6fe-4961-ad00-a39f76ff1a78\") " pod="openshift-multus/network-metrics-daemon-trr7n"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.444610 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.444867 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.444946 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.444998 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8kh6\" (UniqueName: \"kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.445895 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 09:02:43.945857349 +0000 UTC m=+184.267873307 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.454322 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 09:02:43 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Mar 20 09:02:43 crc kubenswrapper[4958]: [+]process-running ok
Mar 20 09:02:43 crc kubenswrapper[4958]: healthz check failed
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.454412 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.548304 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.548362 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.548394 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8kh6\" (UniqueName: \"kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.548491 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.549348 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: E0320 09:02:43.549693 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 09:02:44.049679175 +0000 UTC m=+184.371695123 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-flhr9" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.550041 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.571164 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z8kh6\" (UniqueName: \"kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6\") pod \"redhat-marketplace-z8j2r\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.610570 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.619393 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-trr7n"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.620289 4958 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T09:02:42.972188985Z","Handler":null,"Name":""}
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.631715 4958 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.631757 4958 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.650156 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") "
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.655857 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue ""
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.736439 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8j2r"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.740560 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"]
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.745190 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.755100 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.760866 4958 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice...
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.760933 4958 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.816298 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"]
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.876261 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.876317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bsgc6\" (UniqueName: \"kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.876450 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.918378 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-flhr9\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.927660 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.936349 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.977833 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bsgc6\" (UniqueName: \"kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.978046 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.978104 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.979272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:43 crc kubenswrapper[4958]: I0320 09:02:43.987213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.014996 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-9vnqx"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.038650 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bsgc6\" (UniqueName: \"kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6\") pod \"redhat-marketplace-9xwld\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.111747 4958 generic.go:334] "Generic (PLEG): container finished" podID="d551e28f-f3d1-4135-bc78-f606120df286" containerID="6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b" exitCode=0
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.112694 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerDied","Data":"6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.127110 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xwld"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.139668 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" event={"ID":"f43daafd-85d4-457e-9565-bf4f601ae581","Type":"ContainerStarted","Data":"510b557960e38e3dab6aa02adee1fe5860c55540dda48794a9b24bc46dcd63f8"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.139733 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" event={"ID":"f43daafd-85d4-457e-9565-bf4f601ae581","Type":"ContainerStarted","Data":"c1ec04c54e85f9f88c259cc002351221b1e4c64ac05641f677cf4e2755b8f775"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.140211 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.176125 4958 generic.go:334] "Generic (PLEG): container finished" podID="8baa5a0b-ecd3-4a29-819f-699c33ae89e6" containerID="b130bcae9a04f0215a07dfb9c2851d87b765fee7fac8616504b14f03b54c8021" exitCode=0
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.176201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8baa5a0b-ecd3-4a29-819f-699c33ae89e6","Type":"ContainerDied","Data":"b130bcae9a04f0215a07dfb9c2851d87b765fee7fac8616504b14f03b54c8021"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.192333 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.192415 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.192506 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.192566 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.197809 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-trr7n"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.212042 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.212712 4958 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.214186 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" podStartSLOduration=7.214167045 podStartE2EDuration="7.214167045s" podCreationTimestamp="2026-03-20 09:02:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:44.211132423 +0000 UTC m=+184.533148381" watchObservedRunningTime="2026-03-20 09:02:44.214167045 +0000 UTC m=+184.536183003"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.249061 4958 generic.go:334] "Generic (PLEG): container finished" podID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerID="a7b461d3196a9ec1b2875f1bff180e2af981110343219c7dc7214bcbe903a613" exitCode=0
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.249187 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerDied","Data":"a7b461d3196a9ec1b2875f1bff180e2af981110343219c7dc7214bcbe903a613"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.371264 4958 ???:1] "http: TLS handshake error from 192.168.126.11:37528: no serving certificate available for the kubelet"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.372611 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" event={"ID":"d03ebcab-e060-45f2-99ea-fb25179f824c","Type":"ContainerStarted","Data":"8ffba6c0830162aaddb3d2c5776e546144f19c0fcb5ab2a694c28e4b08474c8e"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.373118 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" event={"ID":"d03ebcab-e060-45f2-99ea-fb25179f824c","Type":"ContainerStarted","Data":"70e241790e0b44bd73cbb18bf6d398fa524c75cf5fd5d914736132277109543b"}
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.387815 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.389081 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.393512 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.395185 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.403663 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.442092 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.451735 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-bphsz" podStartSLOduration=13.451701543 podStartE2EDuration="13.451701543s" podCreationTimestamp="2026-03-20 09:02:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:44.420551271 +0000 UTC m=+184.742567229" watchObservedRunningTime="2026-03-20 09:02:44.451701543 +0000 UTC m=+184.773717501"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.473538 4958 patch_prober.go:28] interesting pod/router-default-5444994796-7qnx6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld
Mar 20 09:02:44 crc kubenswrapper[4958]: [-]has-synced failed: reason withheld
Mar 20 09:02:44 crc kubenswrapper[4958]: [+]process-running ok
Mar 20 09:02:44 crc kubenswrapper[4958]: healthz check failed
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.473590 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-7qnx6" podUID="9d876d21-ae76-4476-ae9c-8ab29931117d" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.494586 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.494755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.595778 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.596615 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-7qnx6"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.596649 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.600144 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.600271 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.600893 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.614276 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.614457 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.615293 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.618129 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.629689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.705999 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l74ms\" (UniqueName: \"kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.706194 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.706317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.789465 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.793492 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.809811 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.809889 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l74ms\" (UniqueName: \"kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.809958 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.810568 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.811970 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.837059 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l74ms\" (UniqueName: \"kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms\") pod \"redhat-operators-p5nh9\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.898376 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.939354 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"]
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.948468 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:44 crc kubenswrapper[4958]: I0320 09:02:44.970933 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"]
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.000937 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5nh9"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.012635 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.012717 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.012780 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2fps\" (UniqueName: \"kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: W0320 09:02:45.023024 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf21e8593_4125_4ea1_ad7f_be4bb994ed6e.slice/crio-c19db44c2cc9ae35a82449d1efe2d336b7972ed261ac04c8b7f132a57184ccf1 WatchSource:0}: Error finding container c19db44c2cc9ae35a82449d1efe2d336b7972ed261ac04c8b7f132a57184ccf1: Status 404 returned error can't find the container with id c19db44c2cc9ae35a82449d1efe2d336b7972ed261ac04c8b7f132a57184ccf1
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.074536 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.114888 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2fps\" (UniqueName: \"kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.115019 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.115054 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.115573 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.116244 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.179519 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2fps\" (UniqueName: \"kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps\") pod \"redhat-operators-smdkg\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.215898 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xff4\" (UniqueName: \"kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4\") pod \"5434e504-53f0-41f5-96bc-1981e69b15ac\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") "
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.216142 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume\") pod \"5434e504-53f0-41f5-96bc-1981e69b15ac\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") "
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.216294 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume\") pod \"5434e504-53f0-41f5-96bc-1981e69b15ac\" (UID: \"5434e504-53f0-41f5-96bc-1981e69b15ac\") "
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.219138 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume" (OuterVolumeSpecName: "config-volume") pod "5434e504-53f0-41f5-96bc-1981e69b15ac" (UID: "5434e504-53f0-41f5-96bc-1981e69b15ac"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.222419 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5434e504-53f0-41f5-96bc-1981e69b15ac" (UID: "5434e504-53f0-41f5-96bc-1981e69b15ac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.239334 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4" (OuterVolumeSpecName: "kube-api-access-9xff4") pod "5434e504-53f0-41f5-96bc-1981e69b15ac" (UID: "5434e504-53f0-41f5-96bc-1981e69b15ac"). InnerVolumeSpecName "kube-api-access-9xff4". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.290697 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.331933 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5434e504-53f0-41f5-96bc-1981e69b15ac-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.332363 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5434e504-53f0-41f5-96bc-1981e69b15ac-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.332375 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xff4\" (UniqueName: \"kubernetes.io/projected/5434e504-53f0-41f5-96bc-1981e69b15ac-kube-api-access-9xff4\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.420066 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" event={"ID":"7fc6b17f-3483-409e-aee4-011ce5afd4c2","Type":"ContainerStarted","Data":"f24ac4694c5b9dbd1a9eb6564ebeee67309d3f4d13e26f98ac55148eea17fa12"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.420106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" event={"ID":"7fc6b17f-3483-409e-aee4-011ce5afd4c2","Type":"ContainerStarted","Data":"ab63a23295153380160611432f61a3d1bd726635050a765f6700f8ca28a4194d"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.420943 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.424194 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-trr7n" event={"ID":"14288bf2-b6fe-4961-ad00-a39f76ff1a78","Type":"ContainerStarted","Data":"4c4802bb1bb59c6b1fe2398dbaee20adea1dacd42a78025d5fe9baac73d29b0e"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.424254 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-trr7n" event={"ID":"14288bf2-b6fe-4961-ad00-a39f76ff1a78","Type":"ContainerStarted","Data":"4f156567d929134aeff4d2c1f6f82c6d68f0bfc21b89be3cf12b954312c83db5"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.427811 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw" event={"ID":"5434e504-53f0-41f5-96bc-1981e69b15ac","Type":"ContainerDied","Data":"1ab2231b2feb70b520e2d4c4a64c8e593e0b445706d2a3bcf04326ff91fc9fcd"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.427851 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1ab2231b2feb70b520e2d4c4a64c8e593e0b445706d2a3bcf04326ff91fc9fcd"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.427883 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.430322 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerStarted","Data":"c19db44c2cc9ae35a82449d1efe2d336b7972ed261ac04c8b7f132a57184ccf1"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.434818 4958 generic.go:334] "Generic (PLEG): container finished" podID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerID="58e7e8c24e35be1d9a5b6c9decfcd600d5441afe9fa4da4377219fb39637ab71" exitCode=0
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.435247 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerDied","Data":"58e7e8c24e35be1d9a5b6c9decfcd600d5441afe9fa4da4377219fb39637ab71"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.435305 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerStarted","Data":"628e318d42108a9b4a134e2ac237c451e8568e1574268dc323742c4d0135ffad"}
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.438212 4958 ???:1] "http: TLS handshake error from 192.168.126.11:37542: no serving certificate available for the kubelet"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.480898 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" podStartSLOduration=113.480866732 podStartE2EDuration="1m53.480866732s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:45.465167262 +0000 UTC m=+185.787183220" watchObservedRunningTime="2026-03-20 09:02:45.480866732 +0000 UTC m=+185.802882690"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.489521 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-7qnx6"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.505580 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-7qnx6"
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.645114 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"]
Mar 20 09:02:45 crc kubenswrapper[4958]: W0320 09:02:45.702472 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pode26c6dd5_c47d_4e08_8fdf_e4a4c3e46bc0.slice/crio-bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e WatchSource:0}: Error finding container bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e: Status 404 returned error can't find the container with id bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.757742 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"]
Mar 20 09:02:45 crc kubenswrapper[4958]: I0320 09:02:45.915546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"]
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.093831 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc"
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.156673 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access\") pod \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") "
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.156735 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir\") pod \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\" (UID: \"8baa5a0b-ecd3-4a29-819f-699c33ae89e6\") "
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.156903 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8baa5a0b-ecd3-4a29-819f-699c33ae89e6" (UID: "8baa5a0b-ecd3-4a29-819f-699c33ae89e6"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.157189 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kubelet-dir\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.173308 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8baa5a0b-ecd3-4a29-819f-699c33ae89e6" (UID: "8baa5a0b-ecd3-4a29-819f-699c33ae89e6"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.260895 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8baa5a0b-ecd3-4a29-819f-699c33ae89e6-kube-api-access\") on node \"crc\" DevicePath \"\""
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.461537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerStarted","Data":"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.461593 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerStarted","Data":"ff00aded18fe65038227f78eab1ededad4551f257fe3c0f805cab24c97bce612"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.466064 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-trr7n" event={"ID":"14288bf2-b6fe-4961-ad00-a39f76ff1a78","Type":"ContainerStarted","Data":"4ff2186fc091e1a0011e81a86da322a28a1436fac1f83fb3e2ea67827278161d"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.481738 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerStarted","Data":"c32e251289438dca04f9f1f8bc8e949811c0f70f58ee1bc6242a9c5c9922fa4e"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.481795 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerStarted","Data":"307af55e839e94ca4aa26086003cc12be08cc61758452900b3809dba41aee089"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.498621 4958 generic.go:334] "Generic (PLEG): container finished" podID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerID="395afe424d1d4901498ff41ef21c320b812e38a35d0662178cd19fee2806bf1d" exitCode=0
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.498963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerDied","Data":"395afe424d1d4901498ff41ef21c320b812e38a35d0662178cd19fee2806bf1d"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.502560 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0","Type":"ContainerStarted","Data":"bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e"}
Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.504944 4958 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.512230 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"8baa5a0b-ecd3-4a29-819f-699c33ae89e6","Type":"ContainerDied","Data":"2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd"} Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.512274 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2eeceefd03da65b6562b7413c4d655e54cabe094fbb5a03287400c08561c1fcd" Mar 20 09:02:46 crc kubenswrapper[4958]: I0320 09:02:46.521965 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-trr7n" podStartSLOduration=114.521941406 podStartE2EDuration="1m54.521941406s" podCreationTimestamp="2026-03-20 09:00:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:02:46.493161745 +0000 UTC m=+186.815177723" watchObservedRunningTime="2026-03-20 09:02:46.521941406 +0000 UTC m=+186.843957364" Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.551856 4958 generic.go:334] "Generic (PLEG): container finished" podID="faa90514-f83a-442b-9d17-08ff904728f2" containerID="07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241" exitCode=0 Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.551967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerDied","Data":"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241"} Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.568878 4958 generic.go:334] "Generic (PLEG): container finished" podID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerID="c32e251289438dca04f9f1f8bc8e949811c0f70f58ee1bc6242a9c5c9922fa4e" exitCode=0 Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.570133 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerDied","Data":"c32e251289438dca04f9f1f8bc8e949811c0f70f58ee1bc6242a9c5c9922fa4e"} Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.598673 4958 generic.go:334] "Generic (PLEG): container finished" podID="e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" containerID="69445bbb775104aeea03a4d24b973114cdb21416528f83f62c8dcefeeaef3f25" exitCode=0 Mar 20 09:02:47 crc kubenswrapper[4958]: I0320 09:02:47.598819 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0","Type":"ContainerDied","Data":"69445bbb775104aeea03a4d24b973114cdb21416528f83f62c8dcefeeaef3f25"} Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.234668 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.388369 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access\") pod \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.388457 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir\") pod \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\" (UID: \"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0\") " Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.388763 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" (UID: "e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.403827 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" (UID: "e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.490961 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.491022 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.547817 4958 ???:1] "http: TLS handshake error from 192.168.126.11:37556: no serving certificate available for the kubelet" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.550449 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-krxrr" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.714240 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0","Type":"ContainerDied","Data":"bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e"} Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.714304 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bc0ae1f38eaf5a5d4376f612b0f808c61b48d9da4baaeba6b953fd12a9fce28e" Mar 20 09:02:49 crc kubenswrapper[4958]: I0320 09:02:49.714400 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 09:02:53 crc kubenswrapper[4958]: I0320 09:02:53.404163 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:53 crc kubenswrapper[4958]: I0320 09:02:53.409783 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:02:54 crc kubenswrapper[4958]: I0320 09:02:54.184063 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:02:54 crc kubenswrapper[4958]: I0320 09:02:54.184112 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:02:54 crc kubenswrapper[4958]: I0320 09:02:54.184150 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:02:54 crc kubenswrapper[4958]: I0320 09:02:54.184200 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:02:56 crc kubenswrapper[4958]: I0320 09:02:56.580111 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"] Mar 20 09:02:56 crc kubenswrapper[4958]: I0320 09:02:56.580423 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" containerID="cri-o://510b557960e38e3dab6aa02adee1fe5860c55540dda48794a9b24bc46dcd63f8" gracePeriod=30 Mar 20 09:02:56 crc kubenswrapper[4958]: I0320 09:02:56.595094 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"] Mar 20 09:02:56 crc kubenswrapper[4958]: I0320 09:02:56.595410 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" containerID="cri-o://192abfbb386834965586639ea3cc95ebb9a873bbe0e041470eb66ff21b373454" gracePeriod=30 Mar 20 09:02:56 crc kubenswrapper[4958]: I0320 09:02:56.968874 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:02:58 crc kubenswrapper[4958]: I0320 09:02:58.827910 4958 generic.go:334] "Generic (PLEG): container finished" podID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerID="192abfbb386834965586639ea3cc95ebb9a873bbe0e041470eb66ff21b373454" exitCode=0 Mar 20 09:02:58 crc kubenswrapper[4958]: I0320 09:02:58.828380 
4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" event={"ID":"ac2a4f4a-c38a-46c9-b757-9437d861719f","Type":"ContainerDied","Data":"192abfbb386834965586639ea3cc95ebb9a873bbe0e041470eb66ff21b373454"} Mar 20 09:02:58 crc kubenswrapper[4958]: I0320 09:02:58.830789 4958 generic.go:334] "Generic (PLEG): container finished" podID="f43daafd-85d4-457e-9565-bf4f601ae581" containerID="510b557960e38e3dab6aa02adee1fe5860c55540dda48794a9b24bc46dcd63f8" exitCode=0 Mar 20 09:02:58 crc kubenswrapper[4958]: I0320 09:02:58.830836 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" event={"ID":"f43daafd-85d4-457e-9565-bf4f601ae581","Type":"ContainerDied","Data":"510b557960e38e3dab6aa02adee1fe5860c55540dda48794a9b24bc46dcd63f8"} Mar 20 09:02:59 crc kubenswrapper[4958]: I0320 09:02:59.454182 4958 patch_prober.go:28] interesting pod/route-controller-manager-6b7cb48b5d-jsl66 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 09:02:59 crc kubenswrapper[4958]: I0320 09:02:59.454305 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 09:03:02 crc kubenswrapper[4958]: I0320 09:03:02.454306 4958 patch_prober.go:28] interesting pod/controller-manager-79fcbf85b8-bk5cz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 09:03:02 crc kubenswrapper[4958]: I0320 09:03:02.454451 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 09:03:03 crc kubenswrapper[4958]: I0320 09:03:03.946569 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184194 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184196 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184267 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" 
probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184339 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184411 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184862 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.184906 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.185079 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="download-server" containerStatusID={"Type":"cri-o","ID":"83625142a9d866192a26f29b173b699ea60dbc3d113c373c3f02e51d12b9fcdb"} pod="openshift-console/downloads-7954f5f757-xpvqq" containerMessage="Container download-server failed liveness probe, will be restarted" Mar 20 09:03:04 crc kubenswrapper[4958]: I0320 09:03:04.185138 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" containerID="cri-o://83625142a9d866192a26f29b173b699ea60dbc3d113c373c3f02e51d12b9fcdb" gracePeriod=2 Mar 20 09:03:05 crc kubenswrapper[4958]: I0320 09:03:05.883100 4958 generic.go:334] "Generic (PLEG): container finished" podID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerID="83625142a9d866192a26f29b173b699ea60dbc3d113c373c3f02e51d12b9fcdb" exitCode=0 Mar 20 09:03:05 crc kubenswrapper[4958]: I0320 09:03:05.883506 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xpvqq" event={"ID":"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006","Type":"ContainerDied","Data":"83625142a9d866192a26f29b173b699ea60dbc3d113c373c3f02e51d12b9fcdb"} Mar 20 09:03:09 crc kubenswrapper[4958]: I0320 09:03:09.454532 4958 patch_prober.go:28] interesting pod/route-controller-manager-6b7cb48b5d-jsl66 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 09:03:09 crc kubenswrapper[4958]: I0320 09:03:09.454627 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 09:03:10 crc 
kubenswrapper[4958]: I0320 09:03:10.055706 4958 ???:1] "http: TLS handshake error from 192.168.126.11:37724: no serving certificate available for the kubelet" Mar 20 09:03:12 crc kubenswrapper[4958]: I0320 09:03:12.454327 4958 patch_prober.go:28] interesting pod/controller-manager-79fcbf85b8-bk5cz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 09:03:12 crc kubenswrapper[4958]: I0320 09:03:12.454671 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.378324 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.378404 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.378455 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.378496 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.380875 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.381232 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.381911 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.392346 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.398870 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.406581 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.407011 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.415400 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.591228 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.600213 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 09:03:13 crc kubenswrapper[4958]: I0320 09:03:13.616299 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.183955 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.184023 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.212633 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-2n2hq" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.485858 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 09:03:14 crc kubenswrapper[4958]: E0320 09:03:14.486197 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5434e504-53f0-41f5-96bc-1981e69b15ac" containerName="collect-profiles" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486219 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5434e504-53f0-41f5-96bc-1981e69b15ac" containerName="collect-profiles" Mar 20 09:03:14 crc kubenswrapper[4958]: E0320 09:03:14.486236 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8baa5a0b-ecd3-4a29-819f-699c33ae89e6" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486246 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8baa5a0b-ecd3-4a29-819f-699c33ae89e6" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: E0320 09:03:14.486262 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486274 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486475 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8baa5a0b-ecd3-4a29-819f-699c33ae89e6" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486495 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e26c6dd5-c47d-4e08-8fdf-e4a4c3e46bc0" containerName="pruner" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.486512 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5434e504-53f0-41f5-96bc-1981e69b15ac" containerName="collect-profiles" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.487224 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.489260 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.492174 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.500980 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.599699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.599791 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.701565 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.701685 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.701729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.737958 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:14 crc kubenswrapper[4958]: I0320 09:03:14.828737 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 09:03:16 crc kubenswrapper[4958]: E0320 09:03:16.602856 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 09:03:16 crc kubenswrapper[4958]: E0320 09:03:16.603100 4958 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 09:03:16 crc kubenswrapper[4958]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 09:03:16 crc kubenswrapper[4958]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-rds2h,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566622-xd9xt_openshift-infra(375c7798-d728-48b0-ac0d-27ba8f57a393): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 09:03:16 crc kubenswrapper[4958]: > logger="UnhandledError" Mar 20 09:03:16 crc kubenswrapper[4958]: E0320 09:03:16.604341 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" Mar 20 09:03:16 crc kubenswrapper[4958]: E0320 09:03:16.958197 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.455525 4958 patch_prober.go:28] interesting pod/route-controller-manager-6b7cb48b5d-jsl66 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.455630 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.683453 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] 
Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.684777 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.699336 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.774574 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.774706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.774731 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.876306 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.876377 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.876394 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.876481 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.876563 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:19 crc kubenswrapper[4958]: I0320 09:03:19.893774 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access\") pod \"installer-9-crc\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:20 crc kubenswrapper[4958]: I0320 09:03:20.010519 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.060152 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.060374 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l74ms,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-p5nh9_openshift-marketplace(ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\": context canceled" logger="UnhandledError" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.061740 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: reading blob sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363: Get \\\"https://registry.redhat.io/v2/redhat/redhat-operator-index/blobs/sha256:cf6d845794adf5448325bc506389d32e0330b3e9db6bf5f46ec1e824f4c04363\\\": context 
canceled\"" pod="openshift-marketplace/redhat-operators-p5nh9" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.619474 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Unknown desc = copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: fetching blob: received unexpected HTTP status: 502 Bad Gateway" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.620204 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jlt8l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-mpjsp_openshift-marketplace(1301d3a7-31fd-44f4-825d-a579e4026c7a): ErrImagePull: copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: fetching blob: received unexpected HTTP status: 502 Bad Gateway" logger="UnhandledError" Mar 20 09:03:22 crc kubenswrapper[4958]: E0320 09:03:22.622744 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"copying system image from manifest list: reading blob sha256:2086b7801d96d309e48e1c678789d95541de89bbae905e6f5a8de845927ca051: fetching blob: received unexpected HTTP status: 502 Bad Gateway\"" pod="openshift-marketplace/community-operators-mpjsp" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" Mar 20 09:03:23 crc kubenswrapper[4958]: I0320 09:03:23.459199 4958 patch_prober.go:28] interesting pod/controller-manager-79fcbf85b8-bk5cz container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 09:03:23 crc kubenswrapper[4958]: I0320 09:03:23.459322 4958 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 09:03:23 crc kubenswrapper[4958]: E0320 09:03:23.801170 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-mpjsp" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" Mar 20 09:03:23 crc kubenswrapper[4958]: E0320 09:03:23.804173 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-p5nh9" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" Mar 20 09:03:23 crc kubenswrapper[4958]: I0320 09:03:23.934108 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:03:23 crc kubenswrapper[4958]: I0320 09:03:23.940038 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.019725 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:24 crc kubenswrapper[4958]: E0320 09:03:24.020266 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.020286 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: E0320 09:03:24.020298 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.020306 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.020431 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" containerName="controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.020442 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" containerName="route-controller-manager" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.020965 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.021208 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" event={"ID":"ac2a4f4a-c38a-46c9-b757-9437d861719f","Type":"ContainerDied","Data":"24dd2d2d063e7d3a37d5d24f3f6c1df39f0470202224c6ac6e320b95ae24517b"} Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.021287 4958 scope.go:117] "RemoveContainer" containerID="192abfbb386834965586639ea3cc95ebb9a873bbe0e041470eb66ff21b373454" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.021458 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.029098 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" event={"ID":"f43daafd-85d4-457e-9565-bf4f601ae581","Type":"ContainerDied","Data":"c1ec04c54e85f9f88c259cc002351221b1e4c64ac05641f677cf4e2755b8f775"} Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.029237 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.030562 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.038497 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config\") pod \"ac2a4f4a-c38a-46c9-b757-9437d861719f\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.038841 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca\") pod \"ac2a4f4a-c38a-46c9-b757-9437d861719f\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.038961 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert\") pod \"ac2a4f4a-c38a-46c9-b757-9437d861719f\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.039081 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config\") pod \"f43daafd-85d4-457e-9565-bf4f601ae581\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.039235 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca\") pod \"f43daafd-85d4-457e-9565-bf4f601ae581\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.039347 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8cdzx\" (UniqueName: 
\"kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx\") pod \"f43daafd-85d4-457e-9565-bf4f601ae581\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.039538 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq62x\" (UniqueName: \"kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x\") pod \"ac2a4f4a-c38a-46c9-b757-9437d861719f\" (UID: \"ac2a4f4a-c38a-46c9-b757-9437d861719f\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.039711 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert\") pod \"f43daafd-85d4-457e-9565-bf4f601ae581\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.040141 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles\") pod \"f43daafd-85d4-457e-9565-bf4f601ae581\" (UID: \"f43daafd-85d4-457e-9565-bf4f601ae581\") " Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.040016 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config" (OuterVolumeSpecName: "config") pod "ac2a4f4a-c38a-46c9-b757-9437d861719f" (UID: "ac2a4f4a-c38a-46c9-b757-9437d861719f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.040823 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca" (OuterVolumeSpecName: "client-ca") pod "f43daafd-85d4-457e-9565-bf4f601ae581" (UID: "f43daafd-85d4-457e-9565-bf4f601ae581"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.042189 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca" (OuterVolumeSpecName: "client-ca") pod "ac2a4f4a-c38a-46c9-b757-9437d861719f" (UID: "ac2a4f4a-c38a-46c9-b757-9437d861719f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.042490 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f43daafd-85d4-457e-9565-bf4f601ae581" (UID: "f43daafd-85d4-457e-9565-bf4f601ae581"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.047818 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config" (OuterVolumeSpecName: "config") pod "f43daafd-85d4-457e-9565-bf4f601ae581" (UID: "f43daafd-85d4-457e-9565-bf4f601ae581"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.048789 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ac2a4f4a-c38a-46c9-b757-9437d861719f" (UID: "ac2a4f4a-c38a-46c9-b757-9437d861719f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.049456 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx" (OuterVolumeSpecName: "kube-api-access-8cdzx") pod "f43daafd-85d4-457e-9565-bf4f601ae581" (UID: "f43daafd-85d4-457e-9565-bf4f601ae581"). InnerVolumeSpecName "kube-api-access-8cdzx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.049822 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f43daafd-85d4-457e-9565-bf4f601ae581" (UID: "f43daafd-85d4-457e-9565-bf4f601ae581"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.054291 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x" (OuterVolumeSpecName: "kube-api-access-vq62x") pod "ac2a4f4a-c38a-46c9-b757-9437d861719f" (UID: "ac2a4f4a-c38a-46c9-b757-9437d861719f"). InnerVolumeSpecName "kube-api-access-vq62x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141631 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-99vcv\" (UniqueName: \"kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141719 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141742 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141771 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141899 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq62x\" (UniqueName: \"kubernetes.io/projected/ac2a4f4a-c38a-46c9-b757-9437d861719f-kube-api-access-vq62x\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141916 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f43daafd-85d4-457e-9565-bf4f601ae581-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141930 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141944 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141955 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ac2a4f4a-c38a-46c9-b757-9437d861719f-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141970 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ac2a4f4a-c38a-46c9-b757-9437d861719f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141981 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.141992 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f43daafd-85d4-457e-9565-bf4f601ae581-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.142004 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8cdzx\" (UniqueName: \"kubernetes.io/projected/f43daafd-85d4-457e-9565-bf4f601ae581-kube-api-access-8cdzx\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.185547 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.185643 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.243757 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.243809 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.243837 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.244719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.244997 4958 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.245042 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-99vcv\" (UniqueName: \"kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.245155 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.245284 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.250469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.263472 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-99vcv\" (UniqueName: \"kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv\") pod \"controller-manager-86f549b547-kgjx5\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.337946 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.351612 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"] Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.355000 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6b7cb48b5d-jsl66"] Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.370725 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"] Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.375007 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-79fcbf85b8-bk5cz"] Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.441372 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac2a4f4a-c38a-46c9-b757-9437d861719f" path="/var/lib/kubelet/pods/ac2a4f4a-c38a-46c9-b757-9437d861719f/volumes" Mar 20 09:03:24 crc kubenswrapper[4958]: I0320 09:03:24.442046 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f43daafd-85d4-457e-9565-bf4f601ae581" path="/var/lib/kubelet/pods/f43daafd-85d4-457e-9565-bf4f601ae581/volumes" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.091709 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.092894 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.095068 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.095442 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.095896 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.096027 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.097446 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.097656 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.102357 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.173369 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.173432 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rkd5\" (UniqueName: \"kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.173471 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.173768 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.275018 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.275102 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rkd5\" (UniqueName: \"kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.275169 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.275276 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.277487 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " 
pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.277915 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.282347 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.293978 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5rkd5\" (UniqueName: \"kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5\") pod \"route-controller-manager-7d446cccdf-7klkm\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.415894 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.521531 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:03:26 crc kubenswrapper[4958]: I0320 09:03:26.521657 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:03:28 crc kubenswrapper[4958]: E0320 09:03:28.930750 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 09:03:28 crc kubenswrapper[4958]: E0320 09:03:28.931769 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-sw8xr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-m4c8h_openshift-marketplace(d551e28f-f3d1-4135-bc78-f606120df286): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:28 crc kubenswrapper[4958]: E0320 09:03:28.933034 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-m4c8h" podUID="d551e28f-f3d1-4135-bc78-f606120df286" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.672910 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-m4c8h" podUID="d551e28f-f3d1-4135-bc78-f606120df286" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.798371 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.798806 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bzbd9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-549hv_openshift-marketplace(fcb5229f-2b8f-4e6a-8542-cd03b84e9737): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.799963 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-549hv" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.800029 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.800289 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bjdx2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-875rt_openshift-marketplace(96818d4d-0c37-4c66-9f05-70d41cefa01d): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:30 crc kubenswrapper[4958]: E0320 09:03:30.801504 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-875rt" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" Mar 20 09:03:31 crc kubenswrapper[4958]: I0320 09:03:31.218518 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 09:03:32 crc kubenswrapper[4958]: E0320 09:03:32.583624 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-549hv" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" Mar 20 09:03:32 crc kubenswrapper[4958]: E0320 09:03:32.583798 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-875rt" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" Mar 20 09:03:32 crc kubenswrapper[4958]: E0320 09:03:32.652770 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 09:03:32 crc kubenswrapper[4958]: E0320 09:03:32.653354 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-bsgc6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-9xwld_openshift-marketplace(f21e8593-4125-4ea1-ad7f-be4bb994ed6e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:32 crc kubenswrapper[4958]: E0320 09:03:32.654625 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-9xwld" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" Mar 20 09:03:32 crc kubenswrapper[4958]: I0320 09:03:32.776697 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 09:03:34 crc kubenswrapper[4958]: I0320 09:03:34.184131 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:34 crc kubenswrapper[4958]: I0320 09:03:34.184824 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:36 crc kubenswrapper[4958]: I0320 09:03:36.519974 4958 scope.go:117] "RemoveContainer" containerID="510b557960e38e3dab6aa02adee1fe5860c55540dda48794a9b24bc46dcd63f8" Mar 20 09:03:36 crc kubenswrapper[4958]: W0320 09:03:36.548849 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod8b9f27e1_cd97_48d0_9abc_9bc4059f4b44.slice/crio-7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54 WatchSource:0}: Error finding container 7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54: Status 404 returned error can't find the container with id 7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54 Mar 
20 09:03:36 crc kubenswrapper[4958]: W0320 09:03:36.553210 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5a14add9_9e6f_4731_a8e3_fbcc968ccdf4.slice/crio-2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9 WatchSource:0}: Error finding container 2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9: Status 404 returned error can't find the container with id 2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9 Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.562067 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-9xwld" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.616402 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.616651 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2fps,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-smdkg_openshift-marketplace(faa90514-f83a-442b-9d17-08ff904728f2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.617905 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-smdkg" podUID="faa90514-f83a-442b-9d17-08ff904728f2" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 
09:03:36.660865 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.661524 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z8kh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-z8j2r_openshift-marketplace(c97ca1fb-e042-4273-b024-bc9dbc806359): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 09:03:36 crc kubenswrapper[4958]: E0320 09:03:36.662845 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-z8j2r" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" Mar 20 09:03:36 crc kubenswrapper[4958]: I0320 09:03:36.857754 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:03:36 crc kubenswrapper[4958]: W0320 09:03:36.900500 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd446a2a1_7f8f_4226_bf15_b108fbe3fbf5.slice/crio-207a9a0a56e06745ff3f2ad51e462ed0683bee0094a19e6d5a0c9f2d3f311e4b WatchSource:0}: Error finding container 207a9a0a56e06745ff3f2ad51e462ed0683bee0094a19e6d5a0c9f2d3f311e4b: Status 404 returned error can't find the container with id 207a9a0a56e06745ff3f2ad51e462ed0683bee0094a19e6d5a0c9f2d3f311e4b Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.118299 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:37 crc kubenswrapper[4958]: 
I0320 09:03:37.121820 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44","Type":"ContainerStarted","Data":"7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.130539 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"719394eb0ebc98d4b54c81a48c8d5c0e3331ec01ae6643e639a7dae0b3f58c29"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.130664 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"0d2bda95c9fbe4ac34d1bfee2cea4cb7fe6ce8a2a77acc2d23491de6d54e06f4"} Mar 20 09:03:37 crc kubenswrapper[4958]: W0320 09:03:37.133575 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4ca2365c_0c71_4e08_a835_5d3a609b8e0a.slice/crio-89b374b9882d9ad204422b26c9d38ce7ba369a1bf3e1b780fdeae30d54c5f058 WatchSource:0}: Error finding container 89b374b9882d9ad204422b26c9d38ce7ba369a1bf3e1b780fdeae30d54c5f058: Status 404 returned error can't find the container with id 89b374b9882d9ad204422b26c9d38ce7ba369a1bf3e1b780fdeae30d54c5f058 Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.134207 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" event={"ID":"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5","Type":"ContainerStarted","Data":"39d4fa8e6dd74cdac27fa0aefbdcef8892bb2ebd7a7592b6d7d7c18c56239116"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.134271 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" event={"ID":"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5","Type":"ContainerStarted","Data":"207a9a0a56e06745ff3f2ad51e462ed0683bee0094a19e6d5a0c9f2d3f311e4b"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.134886 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.137356 4958 patch_prober.go:28] interesting pod/route-controller-manager-7d446cccdf-7klkm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" start-of-body= Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.137448 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.61:8443/healthz\": dial tcp 10.217.0.61:8443: connect: connection refused" Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.153141 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-xpvqq" event={"ID":"a2ac2e2b-d19a-413b-9cfc-c1a8ca008006","Type":"ContainerStarted","Data":"ef769b74b1781dc612eadaaf03d455f00e2bd0a47f68d19000bcf386e2808153"} Mar 20 
09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.154754 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.166717 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.166807 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.178432 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"a0c6fb3ad84857d3ef695b4aad1117ab8886852d1a81dba4c7b59c9604f5edd9"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.178483 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"9d4e84a6b1cbd742310aba405e2e23b1043731039802f25545e5eb4acd9fbc28"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.179008 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.187760 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4","Type":"ContainerStarted","Data":"2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.211832 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"ae60cbdf50abbb9def2086c5045291feea914724add892ffec27abd9c73e9ca9"} Mar 20 09:03:37 crc kubenswrapper[4958]: I0320 09:03:37.211885 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c88f316316c925a59883c8d2b0ca404148a5b0080a3cd21ce47c00ac448afcee"} Mar 20 09:03:37 crc kubenswrapper[4958]: E0320 09:03:37.219917 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-smdkg" podUID="faa90514-f83a-442b-9d17-08ff904728f2" Mar 20 09:03:37 crc kubenswrapper[4958]: E0320 09:03:37.220023 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-z8j2r" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" Mar 20 09:03:37 crc 
kubenswrapper[4958]: I0320 09:03:37.239243 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" podStartSLOduration=21.23920055 podStartE2EDuration="21.23920055s" podCreationTimestamp="2026-03-20 09:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:03:37.238207641 +0000 UTC m=+237.560223599" watchObservedRunningTime="2026-03-20 09:03:37.23920055 +0000 UTC m=+237.561216508" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.022732 4958 csr.go:261] certificate signing request csr-68lm9 is approved, waiting to be issued Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.028341 4958 csr.go:257] certificate signing request csr-68lm9 is issued Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.229915 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" event={"ID":"375c7798-d728-48b0-ac0d-27ba8f57a393","Type":"ContainerStarted","Data":"518ef97b7142a906c9a60e6043be113540c5683a89c2b005ab6356d5fae86135"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.230966 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44","Type":"ContainerStarted","Data":"122c33760731e76c7ebbb28c513f405061ecaa4dee7fcbb9f73f16085ddc4508"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.232355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" event={"ID":"4ca2365c-0c71-4e08-a835-5d3a609b8e0a","Type":"ContainerStarted","Data":"076b4f83e9b374d3ee8ea6323f43216527657725dfd99029b972f8638d5562e8"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.232389 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" event={"ID":"4ca2365c-0c71-4e08-a835-5d3a609b8e0a","Type":"ContainerStarted","Data":"89b374b9882d9ad204422b26c9d38ce7ba369a1bf3e1b780fdeae30d54c5f058"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.232893 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.234499 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerStarted","Data":"b0b56e981b3dca165ff19e6b74900926c1d0c14b8697e35b982049aa89a67714"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.236196 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4","Type":"ContainerStarted","Data":"cca4f5952dcb71862ff0095255bebd5da75edc590cb4c598b03671cbf02f988d"} Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.237917 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.237974 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" 
podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.240734 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.240917 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.248530 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" podStartSLOduration=38.998892815 podStartE2EDuration="1m38.248509061s" podCreationTimestamp="2026-03-20 09:02:00 +0000 UTC" firstStartedPulling="2026-03-20 09:02:37.518308806 +0000 UTC m=+177.840324764" lastFinishedPulling="2026-03-20 09:03:36.767925052 +0000 UTC m=+237.089941010" observedRunningTime="2026-03-20 09:03:38.247710707 +0000 UTC m=+238.569726665" watchObservedRunningTime="2026-03-20 09:03:38.248509061 +0000 UTC m=+238.570525019" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.268206 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=24.268176673 podStartE2EDuration="24.268176673s" podCreationTimestamp="2026-03-20 09:03:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:03:38.268112861 +0000 UTC m=+238.590128809" watchObservedRunningTime="2026-03-20 09:03:38.268176673 +0000 UTC m=+238.590192631" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.289757 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" podStartSLOduration=22.289727842 podStartE2EDuration="22.289727842s" podCreationTimestamp="2026-03-20 09:03:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:03:38.288494085 +0000 UTC m=+238.610510043" watchObservedRunningTime="2026-03-20 09:03:38.289727842 +0000 UTC m=+238.611743800" Mar 20 09:03:38 crc kubenswrapper[4958]: I0320 09:03:38.307732 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=19.307705152 podStartE2EDuration="19.307705152s" podCreationTimestamp="2026-03-20 09:03:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:03:38.304171654 +0000 UTC m=+238.626187612" watchObservedRunningTime="2026-03-20 09:03:38.307705152 +0000 UTC m=+238.629721110" Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.030031 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-12-29 03:15:08.980573322 +0000 UTC Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.030432 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6810h11m29.950143736s for next certificate rotation Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.243774 4958 generic.go:334] "Generic (PLEG): container 
finished" podID="5a14add9-9e6f-4731-a8e3-fbcc968ccdf4" containerID="cca4f5952dcb71862ff0095255bebd5da75edc590cb4c598b03671cbf02f988d" exitCode=0 Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.243867 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4","Type":"ContainerDied","Data":"cca4f5952dcb71862ff0095255bebd5da75edc590cb4c598b03671cbf02f988d"} Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.246633 4958 generic.go:334] "Generic (PLEG): container finished" podID="375c7798-d728-48b0-ac0d-27ba8f57a393" containerID="518ef97b7142a906c9a60e6043be113540c5683a89c2b005ab6356d5fae86135" exitCode=0 Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.247719 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" event={"ID":"375c7798-d728-48b0-ac0d-27ba8f57a393","Type":"ContainerDied","Data":"518ef97b7142a906c9a60e6043be113540c5683a89c2b005ab6356d5fae86135"} Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.248375 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body= Mar 20 09:03:39 crc kubenswrapper[4958]: I0320 09:03:39.248425 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" Mar 20 09:03:40 crc kubenswrapper[4958]: I0320 09:03:40.031538 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-20 02:07:50.017247561 +0000 UTC Mar 20 09:03:40 crc kubenswrapper[4958]: I0320 09:03:40.031628 4958 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7337h4m9.985657003s for next certificate rotation Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.261534 4958 generic.go:334] "Generic (PLEG): container finished" podID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerID="b0b56e981b3dca165ff19e6b74900926c1d0c14b8697e35b982049aa89a67714" exitCode=0 Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.261628 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerDied","Data":"b0b56e981b3dca165ff19e6b74900926c1d0c14b8697e35b982049aa89a67714"} Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.755698 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.769846 4958 util.go:48] "No ready sandbox for pod can be found. 
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.769846 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.882576 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kube-api-access\") pod \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") "
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.882714 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rds2h\" (UniqueName: \"kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h\") pod \"375c7798-d728-48b0-ac0d-27ba8f57a393\" (UID: \"375c7798-d728-48b0-ac0d-27ba8f57a393\") "
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.882768 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir\") pod \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\" (UID: \"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4\") "
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.883049 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "5a14add9-9e6f-4731-a8e3-fbcc968ccdf4" (UID: "5a14add9-9e6f-4731-a8e3-fbcc968ccdf4"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.894728 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h" (OuterVolumeSpecName: "kube-api-access-rds2h") pod "375c7798-d728-48b0-ac0d-27ba8f57a393" (UID: "375c7798-d728-48b0-ac0d-27ba8f57a393"). InnerVolumeSpecName "kube-api-access-rds2h". PluginName "kubernetes.io/projected", VolumeGidValue ""
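The reconciler_common.go / operation_generator.go sequence here is the kubelet volume manager's reconcile loop for the two completed pods: volumes still present in the actual state of the world but absent from the desired state get UnmountVolume, then the plugin's TearDown, and only after that succeeds are they reported "Volume detached". A compressed, illustrative sketch of that desired-vs-actual loop (the worldState type and the reconcile function are stand-ins, not the real kubelet API):

```go
package main

import "fmt"

// Illustrative stand-ins for the volume manager's desiredStateOfWorld /
// actualStateOfWorld caches: pod UID -> names of mounted volumes.
type worldState map[string][]string

// reconcile unmounts volumes that are still mounted (actual) but no
// longer wanted (desired), mirroring the UnmountVolume -> TearDown ->
// "Volume detached" progression in the log.
func reconcile(desired, actual worldState) {
	for podUID, vols := range actual {
		if _, stillWanted := desired[podUID]; stillWanted {
			continue // pod still exists; leave its volumes alone
		}
		for _, v := range vols {
			fmt.Printf("UnmountVolume started for volume %q pod %q\n", v, podUID)
			// ... plugin TearDown would run here ...
			fmt.Printf("Volume detached for volume %q\n", v)
		}
		delete(actual, podUID)
	}
}

func main() {
	actual := worldState{"5a14add9": {"kube-api-access", "kubelet-dir"}}
	desired := worldState{} // revision-pruner-9-crc has been deleted
	reconcile(desired, actual)
}
```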
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.984458 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.984535 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/5a14add9-9e6f-4731-a8e3-fbcc968ccdf4-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:41 crc kubenswrapper[4958]: I0320 09:03:41.984558 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rds2h\" (UniqueName: \"kubernetes.io/projected/375c7798-d728-48b0-ac0d-27ba8f57a393-kube-api-access-rds2h\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.277288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" event={"ID":"375c7798-d728-48b0-ac0d-27ba8f57a393","Type":"ContainerDied","Data":"e41a2b806757d534dbbfbefeffcde58161f29ba9d8fa9b754d3558a05536b84d"} Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.277344 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e41a2b806757d534dbbfbefeffcde58161f29ba9d8fa9b754d3558a05536b84d" Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.277352 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566622-xd9xt" Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.279571 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"5a14add9-9e6f-4731-a8e3-fbcc968ccdf4","Type":"ContainerDied","Data":"2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9"} Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.279627 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2dd83935f6142c4117d865a12d8f759eb035a56db0e98fbc184136420053e8f9" Mar 20 09:03:42 crc kubenswrapper[4958]: I0320 09:03:42.279714 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc"
Mar 20 09:03:43 crc kubenswrapper[4958]: I0320 09:03:43.295384 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerStarted","Data":"d112343654e8ece2c555f721784929b792585a044f3751aed69efac0755581df"}
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.185157 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.185174 4958 patch_prober.go:28] interesting pod/downloads-7954f5f757-xpvqq container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused" start-of-body=
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.185252 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.185259 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-xpvqq" podUID="a2ac2e2b-d19a-413b-9cfc-c1a8ca008006" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.43:8080/\": dial tcp 10.217.0.43:8080: connect: connection refused"
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.302844 4958 generic.go:334] "Generic (PLEG): container finished" podID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerID="d112343654e8ece2c555f721784929b792585a044f3751aed69efac0755581df" exitCode=0
Mar 20 09:03:44 crc kubenswrapper[4958]: I0320 09:03:44.302901 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerDied","Data":"d112343654e8ece2c555f721784929b792585a044f3751aed69efac0755581df"}
Mar 20 09:03:46 crc kubenswrapper[4958]: I0320 09:03:46.317530 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerStarted","Data":"202b742be89e34126fdc698910c0c455020ed04a9bf75db6b4a611df61c176d8"}
Mar 20 09:03:46 crc kubenswrapper[4958]: I0320 09:03:46.340715 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p5nh9" podStartSLOduration=4.930684815 podStartE2EDuration="1m2.340684473s" podCreationTimestamp="2026-03-20 09:02:44 +0000 UTC" firstStartedPulling="2026-03-20 09:02:47.580115132 +0000 UTC m=+187.902131090" lastFinishedPulling="2026-03-20 09:03:44.99011479 +0000 UTC m=+245.312130748" observedRunningTime="2026-03-20 09:03:46.336200605 +0000 UTC m=+246.658216563" watchObservedRunningTime="2026-03-20 09:03:46.340684473 +0000 UTC m=+246.662700441"
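The Liveness and Readiness failures at 09:03:44 have byte-identical output because they are independent HTTP GETs against the same endpoint, 10.217.0.43:8080, where nothing is listening yet. A minimal stand-in for the per-probe check prober.go applies (the real kubelet adds periods, timeouts, and failure thresholds; the 200-399 success window reflects my understanding of its HTTP probe rule):

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeHTTP applies the basic HTTP probe rule: any transport error
// (such as "connect: connection refused") or a status code outside
// the 200-399 range counts as a failure.
func probeHTTP(url string) string {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return fmt.Sprintf("failure: %v", err)
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return "success"
	}
	return fmt.Sprintf("failure: status %d", resp.StatusCode)
}

func main() {
	// The endpoint from the log; with nothing listening this prints a
	// "connection refused" failure, just like the prober.go:107 lines.
	fmt.Println(probeHTTP("http://10.217.0.43:8080/"))
}
```

The "SyncLoop (probe)" status="ready" line at 09:03:54 further down marks the moment the readiness result flips back and the pod rejoins service endpoints.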
event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerStarted","Data":"c4fd84794f34339505882babeedccf0842923e38187c141049a17bb5913860b5"} Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.378802 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerStarted","Data":"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb"} Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.380633 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerStarted","Data":"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a"} Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.381951 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerStarted","Data":"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c"} Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.384786 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerStarted","Data":"1daa1aaf3b5fe03ebea9132c909cc38da98e4a17208c0b5b1ba83ee0358929b0"} Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.480701 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-mpjsp" podStartSLOduration=2.893543756 podStartE2EDuration="1m11.480675985s" podCreationTimestamp="2026-03-20 09:02:41 +0000 UTC" firstStartedPulling="2026-03-20 09:02:43.040026981 +0000 UTC m=+183.362042939" lastFinishedPulling="2026-03-20 09:03:51.62715921 +0000 UTC m=+251.949175168" observedRunningTime="2026-03-20 09:03:52.479320534 +0000 UTC m=+252.801336512" watchObservedRunningTime="2026-03-20 09:03:52.480675985 +0000 UTC m=+252.802691943" Mar 20 09:03:52 crc kubenswrapper[4958]: I0320 09:03:52.704079 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-6j2mb"] Mar 20 09:03:53 crc kubenswrapper[4958]: I0320 09:03:53.404514 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerStarted","Data":"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96"} Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.195117 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-xpvqq" Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.420875 4958 generic.go:334] "Generic (PLEG): container finished" podID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerID="1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a" exitCode=0 Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.421003 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerDied","Data":"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a"} Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.433789 4958 generic.go:334] "Generic (PLEG): container finished" podID="96818d4d-0c37-4c66-9f05-70d41cefa01d" 
containerID="c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c" exitCode=0 Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.433922 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerDied","Data":"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c"} Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.440235 4958 generic.go:334] "Generic (PLEG): container finished" podID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerID="c4fd84794f34339505882babeedccf0842923e38187c141049a17bb5913860b5" exitCode=0 Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.448538 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerDied","Data":"c4fd84794f34339505882babeedccf0842923e38187c141049a17bb5913860b5"} Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.459060 4958 generic.go:334] "Generic (PLEG): container finished" podID="d551e28f-f3d1-4135-bc78-f606120df286" containerID="fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb" exitCode=0 Mar 20 09:03:54 crc kubenswrapper[4958]: I0320 09:03:54.459113 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerDied","Data":"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb"} Mar 20 09:03:55 crc kubenswrapper[4958]: I0320 09:03:55.003414 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:03:55 crc kubenswrapper[4958]: I0320 09:03:55.004346 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:03:55 crc kubenswrapper[4958]: I0320 09:03:55.875523 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.473987 4958 generic.go:334] "Generic (PLEG): container finished" podID="faa90514-f83a-442b-9d17-08ff904728f2" containerID="863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96" exitCode=0 Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.474096 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerDied","Data":"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96"} Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.521868 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.521951 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.526589 4958 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.584297 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.585105 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" podUID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" containerName="controller-manager" containerID="cri-o://076b4f83e9b374d3ee8ea6323f43216527657725dfd99029b972f8638d5562e8" gracePeriod=30 Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.673704 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:03:56 crc kubenswrapper[4958]: I0320 09:03:56.673996 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerName="route-controller-manager" containerID="cri-o://39d4fa8e6dd74cdac27fa0aefbdcef8892bb2ebd7a7592b6d7d7c18c56239116" gracePeriod=30 Mar 20 09:03:57 crc kubenswrapper[4958]: I0320 09:03:57.492673 4958 generic.go:334] "Generic (PLEG): container finished" podID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerID="39d4fa8e6dd74cdac27fa0aefbdcef8892bb2ebd7a7592b6d7d7c18c56239116" exitCode=0 Mar 20 09:03:57 crc kubenswrapper[4958]: I0320 09:03:57.492796 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" event={"ID":"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5","Type":"ContainerDied","Data":"39d4fa8e6dd74cdac27fa0aefbdcef8892bb2ebd7a7592b6d7d7c18c56239116"} Mar 20 09:03:57 crc kubenswrapper[4958]: I0320 09:03:57.496357 4958 generic.go:334] "Generic (PLEG): container finished" podID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" containerID="076b4f83e9b374d3ee8ea6323f43216527657725dfd99029b972f8638d5562e8" exitCode=0 Mar 20 09:03:57 crc kubenswrapper[4958]: I0320 09:03:57.496536 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" event={"ID":"4ca2365c-0c71-4e08-a835-5d3a609b8e0a","Type":"ContainerDied","Data":"076b4f83e9b374d3ee8ea6323f43216527657725dfd99029b972f8638d5562e8"} Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.576067 4958 util.go:48] "No ready sandbox for pod can be found. 
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.576067 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.615096 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-55684fd5db-k5x7k"]
Mar 20 09:03:58 crc kubenswrapper[4958]: E0320 09:03:58.615899 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" containerName="controller-manager"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.615988 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" containerName="controller-manager"
Mar 20 09:03:58 crc kubenswrapper[4958]: E0320 09:03:58.616090 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a14add9-9e6f-4731-a8e3-fbcc968ccdf4" containerName="pruner"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.616167 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a14add9-9e6f-4731-a8e3-fbcc968ccdf4" containerName="pruner"
Mar 20 09:03:58 crc kubenswrapper[4958]: E0320 09:03:58.616258 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" containerName="oc"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.616341 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" containerName="oc"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.616616 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" containerName="controller-manager"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.616709 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" containerName="oc"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.616803 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a14add9-9e6f-4731-a8e3-fbcc968ccdf4" containerName="pruner"
Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.617395 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.635894 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55684fd5db-k5x7k"] Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.668539 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config\") pod \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.668745 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-99vcv\" (UniqueName: \"kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv\") pod \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.668826 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert\") pod \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.668857 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles\") pod \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.668886 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca\") pod \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\" (UID: \"4ca2365c-0c71-4e08-a835-5d3a609b8e0a\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669283 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f9hdv\" (UniqueName: \"kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669342 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669366 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669396 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669455 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.669867 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config" (OuterVolumeSpecName: "config") pod "4ca2365c-0c71-4e08-a835-5d3a609b8e0a" (UID: "4ca2365c-0c71-4e08-a835-5d3a609b8e0a"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.671015 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "4ca2365c-0c71-4e08-a835-5d3a609b8e0a" (UID: "4ca2365c-0c71-4e08-a835-5d3a609b8e0a"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.671399 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca" (OuterVolumeSpecName: "client-ca") pod "4ca2365c-0c71-4e08-a835-5d3a609b8e0a" (UID: "4ca2365c-0c71-4e08-a835-5d3a609b8e0a"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.676444 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv" (OuterVolumeSpecName: "kube-api-access-99vcv") pod "4ca2365c-0c71-4e08-a835-5d3a609b8e0a" (UID: "4ca2365c-0c71-4e08-a835-5d3a609b8e0a"). InnerVolumeSpecName "kube-api-access-99vcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.688384 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4ca2365c-0c71-4e08-a835-5d3a609b8e0a" (UID: "4ca2365c-0c71-4e08-a835-5d3a609b8e0a"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.770849 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.770900 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.770935 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.770966 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771050 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f9hdv\" (UniqueName: \"kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771098 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771109 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-99vcv\" (UniqueName: \"kubernetes.io/projected/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-kube-api-access-99vcv\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771125 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771136 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.771147 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4ca2365c-0c71-4e08-a835-5d3a609b8e0a-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.772484 4958 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.772920 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.773633 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.780902 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.790850 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f9hdv\" (UniqueName: \"kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv\") pod \"controller-manager-55684fd5db-k5x7k\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.814282 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.872480 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca\") pod \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.872675 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5rkd5\" (UniqueName: \"kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5\") pod \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.872742 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config\") pod \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.872796 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert\") pod \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\" (UID: \"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5\") " Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.873718 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca" (OuterVolumeSpecName: "client-ca") pod "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" (UID: "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.873893 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config" (OuterVolumeSpecName: "config") pod "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" (UID: "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.876322 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" (UID: "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.876349 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5" (OuterVolumeSpecName: "kube-api-access-5rkd5") pod "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" (UID: "d446a2a1-7f8f-4226-bf15-b108fbe3fbf5"). InnerVolumeSpecName "kube-api-access-5rkd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.935949 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.974561 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5rkd5\" (UniqueName: \"kubernetes.io/projected/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-kube-api-access-5rkd5\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.974627 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.974642 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:58 crc kubenswrapper[4958]: I0320 09:03:58.974653 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.512176 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" event={"ID":"4ca2365c-0c71-4e08-a835-5d3a609b8e0a","Type":"ContainerDied","Data":"89b374b9882d9ad204422b26c9d38ce7ba369a1bf3e1b780fdeae30d54c5f058"} Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.512255 4958 scope.go:117] "RemoveContainer" containerID="076b4f83e9b374d3ee8ea6323f43216527657725dfd99029b972f8638d5562e8" Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.512248 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-86f549b547-kgjx5" Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.514906 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" event={"ID":"d446a2a1-7f8f-4226-bf15-b108fbe3fbf5","Type":"ContainerDied","Data":"207a9a0a56e06745ff3f2ad51e462ed0683bee0094a19e6d5a0c9f2d3f311e4b"} Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.514954 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm" Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.583711 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.588454 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-86f549b547-kgjx5"] Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.598051 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:03:59 crc kubenswrapper[4958]: I0320 09:03:59.600924 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7d446cccdf-7klkm"] Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.134283 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566624-gtbp8"] Mar 20 09:04:00 crc kubenswrapper[4958]: E0320 09:04:00.134589 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerName="route-controller-manager" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.134797 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerName="route-controller-manager" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.134984 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" containerName="route-controller-manager" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.135440 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.137509 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.140203 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.140373 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.142332 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-gtbp8"] Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.194541 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjwn7\" (UniqueName: \"kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7\") pod \"auto-csr-approver-29566624-gtbp8\" (UID: \"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab\") " pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.226658 4958 scope.go:117] "RemoveContainer" containerID="39d4fa8e6dd74cdac27fa0aefbdcef8892bb2ebd7a7592b6d7d7c18c56239116" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.296747 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjwn7\" (UniqueName: \"kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7\") pod \"auto-csr-approver-29566624-gtbp8\" (UID: \"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab\") " pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.320544 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjwn7\" (UniqueName: \"kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7\") pod \"auto-csr-approver-29566624-gtbp8\" (UID: \"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab\") " pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.445650 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca2365c-0c71-4e08-a835-5d3a609b8e0a" path="/var/lib/kubelet/pods/4ca2365c-0c71-4e08-a835-5d3a609b8e0a/volumes" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.446731 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d446a2a1-7f8f-4226-bf15-b108fbe3fbf5" path="/var/lib/kubelet/pods/d446a2a1-7f8f-4226-bf15-b108fbe3fbf5/volumes" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.464480 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.473072 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-55684fd5db-k5x7k"] Mar 20 09:04:00 crc kubenswrapper[4958]: W0320 09:04:00.479508 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda0ca83ae_d916_4c7e_8887_fc12170212fd.slice/crio-be28575f79f07930dd031481c31b867ff897590ff2fbec5fe04bbfd21e8ceec3 WatchSource:0}: Error finding container be28575f79f07930dd031481c31b867ff897590ff2fbec5fe04bbfd21e8ceec3: Status 404 returned error can't find the container with id be28575f79f07930dd031481c31b867ff897590ff2fbec5fe04bbfd21e8ceec3 Mar 20 09:04:00 crc kubenswrapper[4958]: I0320 09:04:00.524440 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" event={"ID":"a0ca83ae-d916-4c7e-8887-fc12170212fd","Type":"ContainerStarted","Data":"be28575f79f07930dd031481c31b867ff897590ff2fbec5fe04bbfd21e8ceec3"} Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.119296 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl"] Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.120744 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.124229 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.124579 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.124795 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.126064 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.126372 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.126694 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.133090 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl"] Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.217338 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdcdv\" (UniqueName: \"kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.217629 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.217926 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.218211 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.319850 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.320222 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.320347 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xdcdv\" (UniqueName: \"kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.320433 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.321439 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.321521 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.326964 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.352586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdcdv\" (UniqueName: \"kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv\") pod \"route-controller-manager-fdb855975-5dnbl\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.439221 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.534031 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerStarted","Data":"dd0b3d0163aacce5568211b8e740b7799f69206fdf5b2d578b6025241d9500e1"} Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.538589 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerStarted","Data":"5dd344b339bab4611f3abc5ecba6047e2b5ae1eddda5fbba4934c898d7451834"} Mar 20 09:04:01 crc kubenswrapper[4958]: I0320 09:04:01.542702 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerStarted","Data":"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14"} Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.083524 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.083641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.138897 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.585655 4958 generic.go:334] "Generic (PLEG): container finished" podID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerID="dd0b3d0163aacce5568211b8e740b7799f69206fdf5b2d578b6025241d9500e1" exitCode=0 Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.590335 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerDied","Data":"dd0b3d0163aacce5568211b8e740b7799f69206fdf5b2d578b6025241d9500e1"} Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.617928 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9xwld" podStartSLOduration=5.891901298 podStartE2EDuration="1m19.617908757s" podCreationTimestamp="2026-03-20 09:02:43 +0000 UTC" firstStartedPulling="2026-03-20 09:02:46.500998925 +0000 UTC m=+186.823014883" lastFinishedPulling="2026-03-20 09:04:00.227006394 +0000 UTC m=+260.549022342" observedRunningTime="2026-03-20 09:04:02.614971138 +0000 UTC m=+262.936987096" watchObservedRunningTime="2026-03-20 09:04:02.617908757 +0000 UTC m=+262.939924715" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.692476 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.700460 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-875rt" podStartSLOduration=4.884759305 podStartE2EDuration="1m21.700426303s" podCreationTimestamp="2026-03-20 09:02:41 +0000 UTC" firstStartedPulling="2026-03-20 09:02:43.037493394 +0000 UTC m=+183.359509352" lastFinishedPulling="2026-03-20 09:03:59.853160372 +0000 UTC m=+260.175176350" observedRunningTime="2026-03-20 09:04:02.644044931 +0000 UTC m=+262.966060889" watchObservedRunningTime="2026-03-20 09:04:02.700426303 +0000 UTC m=+263.022442271" Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.848655 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-gtbp8"] Mar 20 09:04:02 crc kubenswrapper[4958]: I0320 09:04:02.951848 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl"] Mar 20 09:04:02 crc kubenswrapper[4958]: W0320 09:04:02.958504 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f790d13_f747_4a01_9f2b_87d60076c10d.slice/crio-7baa28af1d25f1b40ecaa4c4ce8e6bafded1ac65c70fb2d8bcdb0137e52b3252 WatchSource:0}: Error finding container 7baa28af1d25f1b40ecaa4c4ce8e6bafded1ac65c70fb2d8bcdb0137e52b3252: Status 404 returned error can't find the container with id 7baa28af1d25f1b40ecaa4c4ce8e6bafded1ac65c70fb2d8bcdb0137e52b3252 Mar 20 09:04:03 crc kubenswrapper[4958]: I0320 09:04:03.595535 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerStarted","Data":"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830"} Mar 20 09:04:03 crc kubenswrapper[4958]: I0320 09:04:03.597293 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" event={"ID":"2f790d13-f747-4a01-9f2b-87d60076c10d","Type":"ContainerStarted","Data":"7baa28af1d25f1b40ecaa4c4ce8e6bafded1ac65c70fb2d8bcdb0137e52b3252"} Mar 20 09:04:03 crc kubenswrapper[4958]: I0320 09:04:03.600333 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" event={"ID":"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab","Type":"ContainerStarted","Data":"966af4fad8c323bda908d71547289980e79dfb9e3f9ed5f50d52a1785af45685"} Mar 20 09:04:03 crc kubenswrapper[4958]: I0320 09:04:03.602617 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" 
event={"ID":"a0ca83ae-d916-4c7e-8887-fc12170212fd","Type":"ContainerStarted","Data":"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6"} Mar 20 09:04:04 crc kubenswrapper[4958]: I0320 09:04:04.128626 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:04:04 crc kubenswrapper[4958]: I0320 09:04:04.128707 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:04:04 crc kubenswrapper[4958]: I0320 09:04:04.175483 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:04:04 crc kubenswrapper[4958]: I0320 09:04:04.635904 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" podStartSLOduration=8.635880925 podStartE2EDuration="8.635880925s" podCreationTimestamp="2026-03-20 09:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:04:04.632659858 +0000 UTC m=+264.954675826" watchObservedRunningTime="2026-03-20 09:04:04.635880925 +0000 UTC m=+264.957896893" Mar 20 09:04:04 crc kubenswrapper[4958]: I0320 09:04:04.669968 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-549hv" podStartSLOduration=4.311445054 podStartE2EDuration="1m23.66994328s" podCreationTimestamp="2026-03-20 09:02:41 +0000 UTC" firstStartedPulling="2026-03-20 09:02:43.074857707 +0000 UTC m=+183.396873665" lastFinishedPulling="2026-03-20 09:04:02.433355933 +0000 UTC m=+262.755371891" observedRunningTime="2026-03-20 09:04:04.667853266 +0000 UTC m=+264.989869224" watchObservedRunningTime="2026-03-20 09:04:04.66994328 +0000 UTC m=+264.991959238" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.672183 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerStarted","Data":"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18"} Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.686677 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" event={"ID":"2f790d13-f747-4a01-9f2b-87d60076c10d","Type":"ContainerStarted","Data":"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd"} Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.686776 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.695786 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerStarted","Data":"c7f214d447c87c57cf0d136d6a477d47b7637f0dfa344988ac59335bb40597b5"} Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.701758 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerStarted","Data":"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a"} Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.702710 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-m4c8h" podStartSLOduration=3.670823442 podStartE2EDuration="1m24.70268926s" podCreationTimestamp="2026-03-20 09:02:41 +0000 UTC" firstStartedPulling="2026-03-20 09:02:44.126847154 +0000 UTC m=+184.448863112" lastFinishedPulling="2026-03-20 09:04:05.158712972 +0000 UTC m=+265.480728930" observedRunningTime="2026-03-20 09:04:05.697793012 +0000 UTC m=+266.019808980" watchObservedRunningTime="2026-03-20 09:04:05.70268926 +0000 UTC m=+266.024705218" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.726512 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z8j2r" podStartSLOduration=2.866655998 podStartE2EDuration="1m22.726481063s" podCreationTimestamp="2026-03-20 09:02:43 +0000 UTC" firstStartedPulling="2026-03-20 09:02:45.436882377 +0000 UTC m=+185.758898335" lastFinishedPulling="2026-03-20 09:04:05.296707442 +0000 UTC m=+265.618723400" observedRunningTime="2026-03-20 09:04:05.722490551 +0000 UTC m=+266.044506529" watchObservedRunningTime="2026-03-20 09:04:05.726481063 +0000 UTC m=+266.048497021" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.758101 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" podStartSLOduration=9.758074902 podStartE2EDuration="9.758074902s" podCreationTimestamp="2026-03-20 09:03:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:04:05.753263286 +0000 UTC m=+266.075279244" watchObservedRunningTime="2026-03-20 09:04:05.758074902 +0000 UTC m=+266.080090860" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.777766 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smdkg" podStartSLOduration=4.146052213 podStartE2EDuration="1m21.77774199s" podCreationTimestamp="2026-03-20 09:02:44 +0000 UTC" firstStartedPulling="2026-03-20 09:02:47.557033956 +0000 UTC m=+187.879049914" lastFinishedPulling="2026-03-20 09:04:05.188723733 +0000 UTC m=+265.510739691" observedRunningTime="2026-03-20 09:04:05.776671537 +0000 UTC m=+266.098687505" watchObservedRunningTime="2026-03-20 09:04:05.77774199 +0000 UTC m=+266.099757938" Mar 20 09:04:05 crc kubenswrapper[4958]: I0320 09:04:05.859565 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:08 crc kubenswrapper[4958]: I0320 09:04:08.733255 4958 generic.go:334] "Generic (PLEG): container finished" podID="a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" containerID="e924a73bca1630d3b50cbb2a554091a99a53e0141904466e9ffe481daed22d71" exitCode=0 Mar 20 09:04:08 crc kubenswrapper[4958]: I0320 09:04:08.733327 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" event={"ID":"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab","Type":"ContainerDied","Data":"e924a73bca1630d3b50cbb2a554091a99a53e0141904466e9ffe481daed22d71"} Mar 20 09:04:08 crc kubenswrapper[4958]: I0320 09:04:08.936687 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:04:08 crc kubenswrapper[4958]: I0320 09:04:08.941584 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.112821 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.215706 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjwn7\" (UniqueName: \"kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7\") pod \"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab\" (UID: \"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab\") " Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.223328 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7" (OuterVolumeSpecName: "kube-api-access-zjwn7") pod "a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" (UID: "a2a79103-8b2b-4ac4-88b0-e03a82ead6ab"). InnerVolumeSpecName "kube-api-access-zjwn7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.317286 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjwn7\" (UniqueName: \"kubernetes.io/projected/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab-kube-api-access-zjwn7\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.751440 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" event={"ID":"a2a79103-8b2b-4ac4-88b0-e03a82ead6ab","Type":"ContainerDied","Data":"966af4fad8c323bda908d71547289980e79dfb9e3f9ed5f50d52a1785af45685"} Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.751500 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="966af4fad8c323bda908d71547289980e79dfb9e3f9ed5f50d52a1785af45685" Mar 20 09:04:10 crc kubenswrapper[4958]: I0320 09:04:10.751568 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566624-gtbp8" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.534255 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.537418 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.597367 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.807914 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.918886 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.919365 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:11 crc kubenswrapper[4958]: I0320 09:04:11.962163 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:12 crc kubenswrapper[4958]: I0320 09:04:12.328944 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:12 crc kubenswrapper[4958]: I0320 09:04:12.329047 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:12 crc kubenswrapper[4958]: I0320 09:04:12.381050 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:12 crc kubenswrapper[4958]: I0320 09:04:12.837958 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:12 crc kubenswrapper[4958]: I0320 09:04:12.838061 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:13 crc kubenswrapper[4958]: I0320 09:04:13.624288 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 09:04:13 crc kubenswrapper[4958]: I0320 09:04:13.737118 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:04:13 crc kubenswrapper[4958]: I0320 09:04:13.737181 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:04:13 crc kubenswrapper[4958]: I0320 09:04:13.787102 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:04:13 crc kubenswrapper[4958]: I0320 09:04:13.830239 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.167861 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:04:14 crc kubenswrapper[4958]: 
I0320 09:04:14.492509 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-875rt"] Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.695031 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-m4c8h"] Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.768661 4958 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.769078 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9" gracePeriod=15 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.769123 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062" gracePeriod=15 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.769238 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae" gracePeriod=15 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.769234 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706" gracePeriod=15 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.769298 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0" gracePeriod=15 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.771491 4958 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.771966 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.771995 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772023 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772041 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772063 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-cert-syncer" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772080 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772103 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772121 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772149 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772166 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772189 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772206 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772227 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772245 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772283 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" containerName="oc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772301 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" containerName="oc" Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.772367 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772384 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772678 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772714 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" containerName="oc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772732 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772761 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 
09:04:14.772785 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772803 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772822 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772839 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.772858 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.773124 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.773149 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: E0320 09:04:14.773175 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.773192 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.773436 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints"
Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.776177 4958 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"]
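The block above is the static-pod update path: a changed on-disk manifest produces "SyncLoop REMOVE" and then "SyncLoop ADD" from source="file", each container of the old kube-apiserver-crc revision is killed with the pod's termination grace period (gracePeriod=15 in the entries above), and the CPU and memory managers discard per-container pinning state for containers that no longer exist ("RemoveStaleState", "Deleted CPUSet assignment"). A sketch of the grace-period contract, using an ordinary OS process as a stand-in for a container: SIGTERM first, SIGKILL only if the grace window expires. This mirrors the semantics only, not kubelet's actual CRI calls:

```go
// graceful_stop.go - the "Killing container with a grace period" contract from
// the entries above, sketched against a plain process instead of a container.
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

// stopWithGrace sends SIGTERM, waits up to grace for a voluntary exit, then
// escalates to SIGKILL - the same escalation kubelet requests from the
// container runtime when a static pod revision is replaced.
func stopWithGrace(cmd *exec.Cmd, grace time.Duration) error {
	if err := cmd.Process.Signal(syscall.SIGTERM); err != nil {
		return err
	}
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	select {
	case err := <-done:
		return err // exited within the grace period
	case <-time.After(grace):
		_ = cmd.Process.Kill() // grace expired: SIGKILL, no more waiting
		return <-done
	}
}

func main() {
	cmd := exec.Command("sleep", "60") // stand-in for the old container
	if err := cmd.Start(); err != nil {
		panic(err)
	}
	err := stopWithGrace(cmd, 15*time.Second) // gracePeriod=15, as logged
	fmt.Println("stopped:", err)
}
```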
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.784413 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-m4c8h" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="registry-server" containerID="cri-o://bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18" gracePeriod=2 Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.788456 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.887981 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.888091 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.888408 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.888533 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.888801 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.889050 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.889123 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.889247 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990301 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990720 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990431 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990755 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990784 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.990936 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991007 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991071 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991325 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991457 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991485 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991663 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991684 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991705 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991817 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:14 crc kubenswrapper[4958]: I0320 09:04:14.991878 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.253414 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.254346 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.292166 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smdkg" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.292241 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smdkg" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.356848 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smdkg" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.357688 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.358192 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.398887 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities\") pod \"d551e28f-f3d1-4135-bc78-f606120df286\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.398957 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sw8xr\" (UniqueName: \"kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr\") pod \"d551e28f-f3d1-4135-bc78-f606120df286\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.399044 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content\") pod \"d551e28f-f3d1-4135-bc78-f606120df286\" (UID: \"d551e28f-f3d1-4135-bc78-f606120df286\") " Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.401605 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities" (OuterVolumeSpecName: "utilities") pod "d551e28f-f3d1-4135-bc78-f606120df286" (UID: "d551e28f-f3d1-4135-bc78-f606120df286"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.407291 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr" (OuterVolumeSpecName: "kube-api-access-sw8xr") pod "d551e28f-f3d1-4135-bc78-f606120df286" (UID: "d551e28f-f3d1-4135-bc78-f606120df286"). InnerVolumeSpecName "kube-api-access-sw8xr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.453232 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d551e28f-f3d1-4135-bc78-f606120df286" (UID: "d551e28f-f3d1-4135-bc78-f606120df286"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.500904 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.500951 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d551e28f-f3d1-4135-bc78-f606120df286-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.500964 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sw8xr\" (UniqueName: \"kubernetes.io/projected/d551e28f-f3d1-4135-bc78-f606120df286-kube-api-access-sw8xr\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.790093 4958 generic.go:334] "Generic (PLEG): container finished" podID="d551e28f-f3d1-4135-bc78-f606120df286" containerID="bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18" exitCode=0 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.790161 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerDied","Data":"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18"} Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.790226 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-m4c8h" event={"ID":"d551e28f-f3d1-4135-bc78-f606120df286","Type":"ContainerDied","Data":"31163c1a295b3b71a757b3f8ff3f62466d67e21e65a54f429130de2f7529fea1"} Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.790261 4958 scope.go:117] "RemoveContainer" containerID="bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.790275 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-m4c8h" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.791575 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.792093 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.793958 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.795642 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.796843 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062" exitCode=0 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.796874 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0" exitCode=0 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.796885 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706" exitCode=0 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.796894 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae" exitCode=2 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.800086 4958 generic.go:334] "Generic (PLEG): container finished" podID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" containerID="122c33760731e76c7ebbb28c513f405061ecaa4dee7fcbb9f73f16085ddc4508" exitCode=0 Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.800153 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44","Type":"ContainerDied","Data":"122c33760731e76c7ebbb28c513f405061ecaa4dee7fcbb9f73f16085ddc4508"} Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.800359 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-875rt" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="registry-server" containerID="cri-o://d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14" gracePeriod=2 Mar 20 09:04:15 crc kubenswrapper[4958]: E0320 09:04:15.800941 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.65:6443: 
connect: connection refused" event="&Event{ObjectMeta:{certified-operators-875rt.189e8149796d1bef openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-875rt,UID:96818d4d-0c37-4c66-9f05-70d41cefa01d,APIVersion:v1,ResourceVersion:28522,FieldPath:spec.containers{registry-server},},Reason:Killing,Message:Stopping container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 09:04:15.800343535 +0000 UTC m=+276.122359493,LastTimestamp:2026-03-20 09:04:15.800343535 +0000 UTC m=+276.122359493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.801472 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.801863 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.802065 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.802303 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.802484 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.802695 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.802887 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.807426 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.808208 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.808498 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.808807 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.817019 4958 scope.go:117] "RemoveContainer" containerID="fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.841964 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smdkg"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.842743 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.843460 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.844701 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.845056 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused"
Mar 20 09:04:15 crc kubenswrapper[4958]: I0320
09:04:15.849533 4958 scope.go:117] "RemoveContainer" containerID="6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.944359 4958 scope.go:117] "RemoveContainer" containerID="bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18" Mar 20 09:04:15 crc kubenswrapper[4958]: E0320 09:04:15.945083 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18\": container with ID starting with bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18 not found: ID does not exist" containerID="bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.945151 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18"} err="failed to get container status \"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18\": rpc error: code = NotFound desc = could not find container \"bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18\": container with ID starting with bc3a5fbca4925bf438f95cd16b92519750dd7572b0d55edbf23852a334e2cf18 not found: ID does not exist" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.945188 4958 scope.go:117] "RemoveContainer" containerID="fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb" Mar 20 09:04:15 crc kubenswrapper[4958]: E0320 09:04:15.946177 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb\": container with ID starting with fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb not found: ID does not exist" containerID="fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.946220 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb"} err="failed to get container status \"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb\": rpc error: code = NotFound desc = could not find container \"fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb\": container with ID starting with fadbb992c0544e344d39e53c4996432ff66f2bae06f893d21d5edd67aa0329bb not found: ID does not exist" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.946251 4958 scope.go:117] "RemoveContainer" containerID="6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b" Mar 20 09:04:15 crc kubenswrapper[4958]: E0320 09:04:15.949511 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b\": container with ID starting with 6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b not found: ID does not exist" containerID="6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.949539 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b"} err="failed to get container status 
\"6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b\": rpc error: code = NotFound desc = could not find container \"6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b\": container with ID starting with 6b0773dec86faf871b7d8de080797047b952744c12d9f201609488e11fb9883b not found: ID does not exist" Mar 20 09:04:15 crc kubenswrapper[4958]: I0320 09:04:15.949556 4958 scope.go:117] "RemoveContainer" containerID="2796ce4bb945880f6e9fd2a5a651f0b3c3270a46e88ea7aa9e3b213f87a81263" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.224924 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.225665 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.226291 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.226964 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.227315 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.313031 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content\") pod \"96818d4d-0c37-4c66-9f05-70d41cefa01d\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.313146 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities\") pod \"96818d4d-0c37-4c66-9f05-70d41cefa01d\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.313233 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjdx2\" (UniqueName: \"kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2\") pod \"96818d4d-0c37-4c66-9f05-70d41cefa01d\" (UID: \"96818d4d-0c37-4c66-9f05-70d41cefa01d\") " Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.314898 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities" 
(OuterVolumeSpecName: "utilities") pod "96818d4d-0c37-4c66-9f05-70d41cefa01d" (UID: "96818d4d-0c37-4c66-9f05-70d41cefa01d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.320248 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2" (OuterVolumeSpecName: "kube-api-access-bjdx2") pod "96818d4d-0c37-4c66-9f05-70d41cefa01d" (UID: "96818d4d-0c37-4c66-9f05-70d41cefa01d"). InnerVolumeSpecName "kube-api-access-bjdx2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.370934 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "96818d4d-0c37-4c66-9f05-70d41cefa01d" (UID: "96818d4d-0c37-4c66-9f05-70d41cefa01d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.415471 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjdx2\" (UniqueName: \"kubernetes.io/projected/96818d4d-0c37-4c66-9f05-70d41cefa01d-kube-api-access-bjdx2\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.415523 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.415543 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/96818d4d-0c37-4c66-9f05-70d41cefa01d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.815419 4958 generic.go:334] "Generic (PLEG): container finished" podID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerID="d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14" exitCode=0 Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.815559 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerDied","Data":"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14"} Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.815572 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-875rt" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.815648 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-875rt" event={"ID":"96818d4d-0c37-4c66-9f05-70d41cefa01d","Type":"ContainerDied","Data":"7b2af21a1ca020f4b44d22b2c9f0f100725c2c7cb29c7dd014e23f8d8582dc6f"} Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.815696 4958 scope.go:117] "RemoveContainer" containerID="d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.817223 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.817951 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.818590 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.819292 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.824300 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.824953 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.825014 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.825414 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: 
connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.826137 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.907968 4958 scope.go:117] "RemoveContainer" containerID="c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c" Mar 20 09:04:16 crc kubenswrapper[4958]: I0320 09:04:16.980060 4958 scope.go:117] "RemoveContainer" containerID="1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.019590 4958 scope.go:117] "RemoveContainer" containerID="d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14" Mar 20 09:04:17 crc kubenswrapper[4958]: E0320 09:04:17.020276 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14\": container with ID starting with d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14 not found: ID does not exist" containerID="d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.020347 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14"} err="failed to get container status \"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14\": rpc error: code = NotFound desc = could not find container \"d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14\": container with ID starting with d51e7ebebe7b3e0253bdd7ea493e16f6a9b94115702fa89b953d752e96dfbb14 not found: ID does not exist" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.020392 4958 scope.go:117] "RemoveContainer" containerID="c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c" Mar 20 09:04:17 crc kubenswrapper[4958]: E0320 09:04:17.021139 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c\": container with ID starting with c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c not found: ID does not exist" containerID="c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.021186 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c"} err="failed to get container status \"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c\": rpc error: code = NotFound desc = could not find container \"c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c\": container with ID starting with c3d416697522d682835b53975efe8bc485f78400a9d91ab12c43335a0d25084c not found: ID does not exist" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.021204 4958 scope.go:117] "RemoveContainer" containerID="1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507" Mar 20 09:04:17 crc kubenswrapper[4958]: E0320 09:04:17.024464 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: 
code = NotFound desc = could not find container \"1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507\": container with ID starting with 1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507 not found: ID does not exist" containerID="1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.024536 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507"} err="failed to get container status \"1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507\": rpc error: code = NotFound desc = could not find container \"1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507\": container with ID starting with 1cfe39f5c3052536f4da2a63b03241963c9be112c3dfb21080b7dfb436b69507 not found: ID does not exist" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.303962 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.304876 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.305145 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.305385 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.305590 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430193 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock\") pod \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430271 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access\") pod \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430352 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock" (OuterVolumeSpecName: "var-lock") pod "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" (UID: "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430410 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir\") pod \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\" (UID: \"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430590 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" (UID: "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430924 4958 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.430944 4958 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.438845 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" (UID: "8b9f27e1-cd97-48d0-9abc-9bc4059f4b44"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.533293 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/8b9f27e1-cd97-48d0-9abc-9bc4059f4b44-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.632982 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.633850 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.634360 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.634705 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.635250 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.635547 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.635809 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.735791 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.735884 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.735956 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.735952 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.736072 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.736182 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.736512 4958 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.736532 4958 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.736546 4958 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.796194 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" containerName="oauth-openshift" containerID="cri-o://f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238" gracePeriod=15 Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.840564 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.842254 4958 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9" exitCode=0 Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.842410 4958 scope.go:117] "RemoveContainer" containerID="7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.842665 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.847272 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"8b9f27e1-cd97-48d0-9abc-9bc4059f4b44","Type":"ContainerDied","Data":"7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54"} Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.847333 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d623d5132af76f9cfdc457531c7941fe359143c7f99efbe257da706b9795a54" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.847427 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.894160 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.894537 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.894932 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.895682 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.896109 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.896795 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.897971 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.898758 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.899403 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 
38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.899903 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.900423 4958 scope.go:117] "RemoveContainer" containerID="96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.955014 4958 scope.go:117] "RemoveContainer" containerID="33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.970507 4958 scope.go:117] "RemoveContainer" containerID="d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae" Mar 20 09:04:17 crc kubenswrapper[4958]: I0320 09:04:17.997045 4958 scope.go:117] "RemoveContainer" containerID="b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.022289 4958 scope.go:117] "RemoveContainer" containerID="9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.046730 4958 scope.go:117] "RemoveContainer" containerID="7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.047303 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\": container with ID starting with 7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062 not found: ID does not exist" containerID="7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.047358 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062"} err="failed to get container status \"7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\": rpc error: code = NotFound desc = could not find container \"7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062\": container with ID starting with 7f96e30d2dc5daaf833b9aab52508b3396d945b0a37bc7701c2c8da47ab47062 not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.047399 4958 scope.go:117] "RemoveContainer" containerID="96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.048318 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\": container with ID starting with 96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0 not found: ID does not exist" containerID="96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.048354 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0"} err="failed to get container status \"96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\": 
rpc error: code = NotFound desc = could not find container \"96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0\": container with ID starting with 96a820e7b2e8cbdfa211338251eec5414baba1eabb4ea0ff75e6e255e245f9f0 not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.048383 4958 scope.go:117] "RemoveContainer" containerID="33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.048939 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\": container with ID starting with 33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706 not found: ID does not exist" containerID="33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.048986 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706"} err="failed to get container status \"33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\": rpc error: code = NotFound desc = could not find container \"33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706\": container with ID starting with 33c4f944a061d2e00f3a14167fc188c0f8743518824260666aa9d753b2405706 not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.049039 4958 scope.go:117] "RemoveContainer" containerID="d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.049319 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\": container with ID starting with d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae not found: ID does not exist" containerID="d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.049351 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae"} err="failed to get container status \"d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\": rpc error: code = NotFound desc = could not find container \"d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae\": container with ID starting with d4f0386602fae0384395bd9ba57b7cce43875e182dcd13ff0449c0059c54d8ae not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.049366 4958 scope.go:117] "RemoveContainer" containerID="b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.049567 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\": container with ID starting with b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9 not found: ID does not exist" containerID="b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.049587 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9"} err="failed to get container status \"b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\": rpc error: code = NotFound desc = could not find container \"b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9\": container with ID starting with b7777dd29552e17b3a0a1c5be711ac6882cecd9cbad42bdfabc8d2f394bbeef9 not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.049611 4958 scope.go:117] "RemoveContainer" containerID="9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.050790 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\": container with ID starting with 9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e not found: ID does not exist" containerID="9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.050868 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e"} err="failed to get container status \"9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\": rpc error: code = NotFound desc = could not find container \"9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e\": container with ID starting with 9151aed5721934354f9f85ab3abe1940b57f9fbc091efa5c25ab2c159613c20e not found: ID does not exist" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.325945 4958 util.go:48] "No ready sandbox for pod can be found. 
Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.326535 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.327171 4958 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.327798 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.328036 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.328307 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.328811 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449623 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449675 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449745 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bhlqk\" (UniqueName: \"kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: 
\"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449765 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449784 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449814 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449836 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449874 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449904 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449938 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449955 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.449984 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 
09:04:18.450004 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.450042 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle\") pod \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\" (UID: \"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e\") " Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.450708 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.450807 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.451179 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.451407 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.451862 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.457824 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.457995 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk" (OuterVolumeSpecName: "kube-api-access-bhlqk") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "kube-api-access-bhlqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.458043 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.458157 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.458703 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.459282 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.459355 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.459548 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.460010 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.460734 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" (UID: "f3ec3613-8ec0-457b-b1d0-3c17a30bae2e"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552229 4958 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552271 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bhlqk\" (UniqueName: \"kubernetes.io/projected/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-kube-api-access-bhlqk\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552285 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552724 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552743 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552755 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552767 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552781 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552792 4958 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552802 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552815 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552828 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552839 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.552851 4958 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.859785 4958 generic.go:334] "Generic (PLEG): container finished" podID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" containerID="f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238" exitCode=0 Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.859850 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" event={"ID":"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e","Type":"ContainerDied","Data":"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238"} Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.859905 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" event={"ID":"f3ec3613-8ec0-457b-b1d0-3c17a30bae2e","Type":"ContainerDied","Data":"f3bbccf610e430818a627a78b7f394ad16c2c315cb78e6cf618e29d64caaed1d"} Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.859946 4958 scope.go:117] "RemoveContainer" containerID="f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.860517 4958 util.go:48] "No ready sandbox for pod can be found. 
Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.861976 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.862749 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.863654 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.864186 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.864697 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.867918 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.868348 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.868878 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.869258 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.869664 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.886128 4958 scope.go:117] "RemoveContainer" containerID="f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238" Mar 20 09:04:18 crc kubenswrapper[4958]: E0320 09:04:18.886899 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238\": container with ID starting with f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238 not found: ID does not exist" containerID="f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238" Mar 20 09:04:18 crc kubenswrapper[4958]: I0320 09:04:18.886985 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238"} err="failed to get container status \"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238\": rpc error: code = NotFound desc = could not find container \"f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238\": container with ID starting with f37f0f8650b06af50f534c978be3aa4822e8172d94c2d8adf9a1f98746e55238 not found: ID does not exist" Mar 20 09:04:19 crc kubenswrapper[4958]: E0320 09:04:19.825380 4958 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:19 crc kubenswrapper[4958]: I0320 09:04:19.827079 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.442008 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.442751 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.443245 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.443805 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.444343 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.878817 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"07f4cd87d7d7c556da4db9a450ec5263ddd1b8bc876fefdd7167f6bf82eec49d"} Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.878884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"8f96748db3789efb0a3e8c1ef9da6dff558ff9e63cabd4a1715111caa1ea7a03"} Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.879574 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: E0320 09:04:20.879741 4958 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.880198 
4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.880518 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.881466 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:20 crc kubenswrapper[4958]: I0320 09:04:20.882408 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.111635 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.111907 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.112087 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.112257 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.112428 4958 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:22 crc kubenswrapper[4958]: I0320 09:04:22.112459 4958 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.112645 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="200ms" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 
09:04:22.314464 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="400ms" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.716409 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="800ms" Mar 20 09:04:22 crc kubenswrapper[4958]: E0320 09:04:22.814891 4958 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/events\": dial tcp 38.129.56.65:6443: connect: connection refused" event="&Event{ObjectMeta:{certified-operators-875rt.189e8149796d1bef openshift-marketplace 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-marketplace,Name:certified-operators-875rt,UID:96818d4d-0c37-4c66-9f05-70d41cefa01d,APIVersion:v1,ResourceVersion:28522,FieldPath:spec.containers{registry-server},},Reason:Killing,Message:Stopping container registry-server,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 09:04:15.800343535 +0000 UTC m=+276.122359493,LastTimestamp:2026-03-20 09:04:15.800343535 +0000 UTC m=+276.122359493,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 09:04:23 crc kubenswrapper[4958]: E0320 09:04:23.517432 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="1.6s" Mar 20 09:04:25 crc kubenswrapper[4958]: E0320 09:04:25.118201 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="3.2s" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.522087 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.522483 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.522556 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.523381 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711"} 
pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.523473 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711" gracePeriod=600 Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.929365 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711" exitCode=0 Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.929555 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711"} Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.929685 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f"} Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.930262 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.930449 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.930695 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.930907 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 09:04:26.931062 4958 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-kvsdf\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:26 crc kubenswrapper[4958]: I0320 
09:04:26.931200 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.434268 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.435717 4958 status_manager.go:851] "Failed to get status for pod" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.436409 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.437029 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.437405 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.441815 4958 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-kvsdf\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.443886 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.457526 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.457559 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:27 crc kubenswrapper[4958]: E0320 09:04:27.458235 4958 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.459273 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:27 crc kubenswrapper[4958]: W0320 09:04:27.478260 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-80522eb8418198ba774f1fd264fec9540d27595a6058b364233a1317f93ecb81 WatchSource:0}: Error finding container 80522eb8418198ba774f1fd264fec9540d27595a6058b364233a1317f93ecb81: Status 404 returned error can't find the container with id 80522eb8418198ba774f1fd264fec9540d27595a6058b364233a1317f93ecb81 Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.937931 4958 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="02ad06f76c865c4b68fa2f03feef81900777aa8b908d5aab7a18dad5c9fda580" exitCode=0 Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.938013 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"02ad06f76c865c4b68fa2f03feef81900777aa8b908d5aab7a18dad5c9fda580"} Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.938382 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"80522eb8418198ba774f1fd264fec9540d27595a6058b364233a1317f93ecb81"} Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.938830 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.938854 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:27 crc kubenswrapper[4958]: E0320 09:04:27.939343 4958 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.939357 4958 status_manager.go:851] "Failed to get status for pod" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/pods/machine-config-daemon-kvsdf\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.939981 4958 status_manager.go:851] "Failed to get status for pod" podUID="faa90514-f83a-442b-9d17-08ff904728f2" pod="openshift-marketplace/redhat-operators-smdkg" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/redhat-operators-smdkg\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.940316 4958 status_manager.go:851] "Failed to get status for pod" 
podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.940635 4958 status_manager.go:851] "Failed to get status for pod" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" pod="openshift-authentication/oauth-openshift-558db77b4-6j2mb" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-6j2mb\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.940931 4958 status_manager.go:851] "Failed to get status for pod" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" pod="openshift-marketplace/certified-operators-875rt" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/certified-operators-875rt\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:27 crc kubenswrapper[4958]: I0320 09:04:27.941331 4958 status_manager.go:851] "Failed to get status for pod" podUID="d551e28f-f3d1-4135-bc78-f606120df286" pod="openshift-marketplace/community-operators-m4c8h" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-marketplace/pods/community-operators-m4c8h\": dial tcp 38.129.56.65:6443: connect: connection refused" Mar 20 09:04:28 crc kubenswrapper[4958]: E0320 09:04:28.320270 4958 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.65:6443: connect: connection refused" interval="6.4s" Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.947268 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.953234 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.953322 4958 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="485288d9b577950a20ea275f1289685b34ff9cf6debe3c6ddc1170b70ff8ef88" exitCode=1 Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.953505 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"485288d9b577950a20ea275f1289685b34ff9cf6debe3c6ddc1170b70ff8ef88"} Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.954226 4958 scope.go:117] "RemoveContainer" containerID="485288d9b577950a20ea275f1289685b34ff9cf6debe3c6ddc1170b70ff8ef88" Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.961608 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4a51cfdc6f199725457234d9ed4ea0a91b6f73a4afd4bd40f1efa511cbae4e0f"} Mar 20 09:04:28 crc kubenswrapper[4958]: I0320 09:04:28.961674 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4dd92dc6ffe77d4fe5388d569df9a30d59d5327cb2dda228c83fc81dbd1ed32c"} Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971371 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"fe4214d977d4cf2002b5ff6ba814a030a0545a8627e1acb959a49a83dd17d355"} Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971440 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"d0715adc2dad36c97b33a2d4a4696118bfa21f927c5550bb85e13d36894827ad"} Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971455 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9bb981cef7328362e3a9a5f197a5bda825b6acb058306883f1dcd33768e5aeae"} Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971574 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971814 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.971853 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.974790 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.976243 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 09:04:29 crc kubenswrapper[4958]: I0320 09:04:29.976339 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"9c1368f17fa4cdc5a5b2e9e78ec41b893dc05f99bdfcfe3c13bd86c2e14d9895"} Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 09:04:32.246848 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 09:04:32.250743 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 09:04:32.459803 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 09:04:32.459864 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 09:04:32.465285 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:32 crc kubenswrapper[4958]: I0320 
09:04:32.994088 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:04:35 crc kubenswrapper[4958]: I0320 09:04:35.052702 4958 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:35 crc kubenswrapper[4958]: I0320 09:04:35.278095 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c9c3d73c-2309-421d-87f8-d4b6a8cd0344" Mar 20 09:04:36 crc kubenswrapper[4958]: I0320 09:04:36.014271 4958 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:36 crc kubenswrapper[4958]: I0320 09:04:36.014653 4958 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="382f857a-419b-4239-98bd-5f96a093f2cd" Mar 20 09:04:36 crc kubenswrapper[4958]: I0320 09:04:36.018653 4958 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="c9c3d73c-2309-421d-87f8-d4b6a8cd0344" Mar 20 09:04:44 crc kubenswrapper[4958]: I0320 09:04:44.559353 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 09:04:45 crc kubenswrapper[4958]: I0320 09:04:45.754355 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 09:04:45 crc kubenswrapper[4958]: I0320 09:04:45.776506 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 09:04:45 crc kubenswrapper[4958]: I0320 09:04:45.975726 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.128910 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.140587 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.394577 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.605768 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.685351 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 09:04:46 crc kubenswrapper[4958]: I0320 09:04:46.832625 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.050732 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.122995 4958 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.380127 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.389421 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.454753 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.526204 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.590477 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 09:04:47 crc kubenswrapper[4958]: I0320 09:04:47.701922 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.172094 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.299747 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.363081 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.452990 4958 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.470265 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.539520 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.573474 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.654304 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.677501 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 09:04:48 crc kubenswrapper[4958]: I0320 09:04:48.848742 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.056410 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.093203 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.111258 4958 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.128543 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.203732 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.216412 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.275506 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.437068 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.477962 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.537638 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.745830 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.763565 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.939479 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.978174 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 09:04:49 crc kubenswrapper[4958]: I0320 09:04:49.997363 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.031923 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.103124 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.131653 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.133961 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.152221 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.180460 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.241222 4958 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.279267 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.511683 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.560664 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.671222 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.693745 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.703273 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.726176 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.726813 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.816160 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 09:04:50 crc kubenswrapper[4958]: I0320 09:04:50.837952 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.024205 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.032623 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.076887 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.133329 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.142752 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.155261 4958 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.278580 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.283722 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.313102 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 09:04:51 crc 
kubenswrapper[4958]: I0320 09:04:51.460947 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.470270 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.569757 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.657764 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.720777 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.769232 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.794007 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.839626 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 09:04:51 crc kubenswrapper[4958]: I0320 09:04:51.954937 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.010366 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.118393 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.151885 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.236716 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.350760 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.384120 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.423985 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.435637 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.459160 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.644273 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.721635 4958 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.752903 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.814801 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.817210 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.899448 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 09:04:52 crc kubenswrapper[4958]: I0320 09:04:52.904396 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.035750 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.170952 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.207447 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.223656 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.310620 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.317540 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.326478 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.327170 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.339472 4958 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.343702 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.439061 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.522493 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.550069 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: 
I0320 09:04:53.556705 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.572401 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.580417 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.593282 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.648101 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.695069 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.696711 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.766115 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 09:04:53 crc kubenswrapper[4958]: I0320 09:04:53.952008 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.055283 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.079018 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.088644 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.122961 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.143826 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.169921 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.188399 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.256699 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.346626 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.352727 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.373064 4958 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.406135 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.467425 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.559933 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.602293 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.718659 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.805321 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.870497 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 09:04:54 crc kubenswrapper[4958]: I0320 09:04:54.953044 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.066058 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.071138 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.079368 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.097497 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.168086 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.190670 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.293942 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.306936 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.406184 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.529388 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 
09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.584082 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.596651 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.621297 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.650158 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.684670 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.761229 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 09:04:55 crc kubenswrapper[4958]: I0320 09:04:55.888232 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.086356 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.110110 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.124615 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.339263 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.368532 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.442386 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.473621 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.536033 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.539027 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.542420 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.582842 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.611012 4958 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.727211 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.755103 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.785295 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.815636 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.825536 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.882630 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.925071 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.927790 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.949342 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.969861 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.986356 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 09:04:56 crc kubenswrapper[4958]: I0320 09:04:56.992143 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.004352 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.066814 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.189393 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.292442 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.342955 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.365672 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.402273 4958 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.539216 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.541086 4958 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.547330 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-6j2mb","openshift-marketplace/community-operators-m4c8h","openshift-marketplace/certified-operators-875rt"] Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.547430 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.547456 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl","openshift-controller-manager/controller-manager-55684fd5db-k5x7k"] Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.547737 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" podUID="a0ca83ae-d916-4c7e-8887-fc12170212fd" containerName="controller-manager" containerID="cri-o://53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6" gracePeriod=30 Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.547988 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" podUID="2f790d13-f747-4a01-9f2b-87d60076c10d" containerName="route-controller-manager" containerID="cri-o://13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd" gracePeriod=30 Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.555956 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.556305 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.558938 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.561837 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.571839 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.571814801 podStartE2EDuration="22.571814801s" podCreationTimestamp="2026-03-20 09:04:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:04:57.571016966 +0000 UTC m=+317.893032934" watchObservedRunningTime="2026-03-20 09:04:57.571814801 +0000 UTC m=+317.893830759" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.576587 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 09:04:57 
crc kubenswrapper[4958]: I0320 09:04:57.638773 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.648470 4958 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.715896 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.823428 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.871867 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 09:04:57 crc kubenswrapper[4958]: I0320 09:04:57.935221 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.043035 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.061059 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.062051 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097004 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097259 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097271 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097284 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="extract-content" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097290 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="extract-content" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097301 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="extract-utilities" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097308 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="extract-utilities" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097316 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="extract-content" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097322 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="extract-content" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 
09:04:58.097331 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f790d13-f747-4a01-9f2b-87d60076c10d" containerName="route-controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097337 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f790d13-f747-4a01-9f2b-87d60076c10d" containerName="route-controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097343 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0ca83ae-d916-4c7e-8887-fc12170212fd" containerName="controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097349 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0ca83ae-d916-4c7e-8887-fc12170212fd" containerName="controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097357 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" containerName="oauth-openshift" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097362 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" containerName="oauth-openshift" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097372 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" containerName="installer" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097378 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" containerName="installer" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097386 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="extract-utilities" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097393 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="extract-utilities" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.097401 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097406 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097499 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0ca83ae-d916-4c7e-8887-fc12170212fd" containerName="controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097511 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097519 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f790d13-f747-4a01-9f2b-87d60076c10d" containerName="route-controller-manager" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097527 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d551e28f-f3d1-4135-bc78-f606120df286" containerName="registry-server" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097534 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b9f27e1-cd97-48d0-9abc-9bc4059f4b44" containerName="installer" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097542 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" 
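
[Annotation] The cpu_manager / memory_manager burst above is kubelet clearing stale resource-manager state for pods that no longer exist (the just-removed oauth-openshift, marketplace registry, installer, and controller-manager pods). The E-prefixed lines are emitted at error severity but, in this trace, describe what looks like routine cleanup rather than a failure. A schematic of that sweep, with illustrative types rather than kubelet's actual state package:

// Schematic of the RemoveStaleState sweep described by the
// cpu_manager.go:410 / state_mem.go:107 lines above. Types and
// values are illustrative, not kubelet's.
package main

import "fmt"

type key struct{ podUID, container string }

// removeStale drops assignments whose pod is no longer active,
// mirroring the "Deleted CPUSet assignment" log lines.
func removeStale(assignments map[key]string, active map[string]bool) {
	for k := range assignments {
		if !active[k.podUID] {
			fmt.Printf("removing stale container %q of pod %s\n", k.container, k.podUID)
			delete(assignments, k)
		}
	}
}

func main() {
	assignments := map[key]string{
		{podUID: "2f790d13-f747-4a01-9f2b-87d60076c10d", container: "route-controller-manager"}: "0-3",
	}
	removeStale(assignments, map[string]bool{}) // no pods active: everything is swept
}
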
containerName="oauth-openshift" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.097921 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.107541 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.129724 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141193 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xdcdv\" (UniqueName: \"kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv\") pod \"2f790d13-f747-4a01-9f2b-87d60076c10d\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141246 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca\") pod \"a0ca83ae-d916-4c7e-8887-fc12170212fd\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141269 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f9hdv\" (UniqueName: \"kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv\") pod \"a0ca83ae-d916-4c7e-8887-fc12170212fd\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141293 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert\") pod \"a0ca83ae-d916-4c7e-8887-fc12170212fd\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141320 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert\") pod \"2f790d13-f747-4a01-9f2b-87d60076c10d\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config\") pod \"a0ca83ae-d916-4c7e-8887-fc12170212fd\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141392 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca\") pod \"2f790d13-f747-4a01-9f2b-87d60076c10d\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141412 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config\") pod \"2f790d13-f747-4a01-9f2b-87d60076c10d\" (UID: \"2f790d13-f747-4a01-9f2b-87d60076c10d\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.141458 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started 
for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles\") pod \"a0ca83ae-d916-4c7e-8887-fc12170212fd\" (UID: \"a0ca83ae-d916-4c7e-8887-fc12170212fd\") " Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.143092 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca" (OuterVolumeSpecName: "client-ca") pod "a0ca83ae-d916-4c7e-8887-fc12170212fd" (UID: "a0ca83ae-d916-4c7e-8887-fc12170212fd"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.143108 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config" (OuterVolumeSpecName: "config") pod "a0ca83ae-d916-4c7e-8887-fc12170212fd" (UID: "a0ca83ae-d916-4c7e-8887-fc12170212fd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.143959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a0ca83ae-d916-4c7e-8887-fc12170212fd" (UID: "a0ca83ae-d916-4c7e-8887-fc12170212fd"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.144484 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca" (OuterVolumeSpecName: "client-ca") pod "2f790d13-f747-4a01-9f2b-87d60076c10d" (UID: "2f790d13-f747-4a01-9f2b-87d60076c10d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.144710 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.144959 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config" (OuterVolumeSpecName: "config") pod "2f790d13-f747-4a01-9f2b-87d60076c10d" (UID: "2f790d13-f747-4a01-9f2b-87d60076c10d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.157784 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a0ca83ae-d916-4c7e-8887-fc12170212fd" (UID: "a0ca83ae-d916-4c7e-8887-fc12170212fd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.157903 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv" (OuterVolumeSpecName: "kube-api-access-xdcdv") pod "2f790d13-f747-4a01-9f2b-87d60076c10d" (UID: "2f790d13-f747-4a01-9f2b-87d60076c10d"). InnerVolumeSpecName "kube-api-access-xdcdv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.157985 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv" (OuterVolumeSpecName: "kube-api-access-f9hdv") pod "a0ca83ae-d916-4c7e-8887-fc12170212fd" (UID: "a0ca83ae-d916-4c7e-8887-fc12170212fd"). InnerVolumeSpecName "kube-api-access-f9hdv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.157938 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "2f790d13-f747-4a01-9f2b-87d60076c10d" (UID: "2f790d13-f747-4a01-9f2b-87d60076c10d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.159844 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f790d13-f747-4a01-9f2b-87d60076c10d" containerID="13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd" exitCode=0 Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.159897 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" event={"ID":"2f790d13-f747-4a01-9f2b-87d60076c10d","Type":"ContainerDied","Data":"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd"} Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.159951 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.159991 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl" event={"ID":"2f790d13-f747-4a01-9f2b-87d60076c10d","Type":"ContainerDied","Data":"7baa28af1d25f1b40ecaa4c4ce8e6bafded1ac65c70fb2d8bcdb0137e52b3252"} Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.160024 4958 scope.go:117] "RemoveContainer" containerID="13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.163200 4958 generic.go:334] "Generic (PLEG): container finished" podID="a0ca83ae-d916-4c7e-8887-fc12170212fd" containerID="53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6" exitCode=0 Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.163750 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.163759 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" event={"ID":"a0ca83ae-d916-4c7e-8887-fc12170212fd","Type":"ContainerDied","Data":"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6"} Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.163841 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-55684fd5db-k5x7k" event={"ID":"a0ca83ae-d916-4c7e-8887-fc12170212fd","Type":"ContainerDied","Data":"be28575f79f07930dd031481c31b867ff897590ff2fbec5fe04bbfd21e8ceec3"} Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.179014 4958 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.191748 4958 scope.go:117] "RemoveContainer" containerID="13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.192423 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd\": container with ID starting with 13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd not found: ID does not exist" containerID="13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.192508 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd"} err="failed to get container status \"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd\": rpc error: code = NotFound desc = could not find container \"13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd\": container with ID starting with 13eb935e75bb30ee1ee289e5d42209f992f9e2d9597074bf90097d7e6f00aecd not found: ID does not exist" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.192538 4958 scope.go:117] "RemoveContainer" containerID="53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.204752 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-55684fd5db-k5x7k"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.211723 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-55684fd5db-k5x7k"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.215034 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.218007 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-fdb855975-5dnbl"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.218358 4958 scope.go:117] "RemoveContainer" containerID="53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6" Mar 20 09:04:58 crc kubenswrapper[4958]: E0320 09:04:58.218858 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6\": container with ID starting with 53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6 not found: ID does not exist" containerID="53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.218888 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6"} err="failed to get container status \"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6\": rpc error: code = NotFound desc = could not find container \"53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6\": container with ID starting with 53254508f249012ed8cc0536d2c5342e03cbdc01e09223b9a2b4df32fa36b4f6 not found: ID does not exist" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.242525 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.242618 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.242755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.242821 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjrnt\" (UniqueName: \"kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243069 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243109 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xdcdv\" (UniqueName: \"kubernetes.io/projected/2f790d13-f747-4a01-9f2b-87d60076c10d-kube-api-access-xdcdv\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243131 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 
09:04:58.243146 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f9hdv\" (UniqueName: \"kubernetes.io/projected/a0ca83ae-d916-4c7e-8887-fc12170212fd-kube-api-access-f9hdv\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243158 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a0ca83ae-d916-4c7e-8887-fc12170212fd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243173 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2f790d13-f747-4a01-9f2b-87d60076c10d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243186 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0ca83ae-d916-4c7e-8887-fc12170212fd-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243198 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.243208 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2f790d13-f747-4a01-9f2b-87d60076c10d-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.262049 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.341180 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.344959 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.345118 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.346078 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.346455 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " 
pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.346657 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjrnt\" (UniqueName: \"kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.347084 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.351437 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.366196 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjrnt\" (UniqueName: \"kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt\") pod \"route-controller-manager-55f4f49fdf-zwll7\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.422437 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.445544 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f790d13-f747-4a01-9f2b-87d60076c10d" path="/var/lib/kubelet/pods/2f790d13-f747-4a01-9f2b-87d60076c10d/volumes" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.448589 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96818d4d-0c37-4c66-9f05-70d41cefa01d" path="/var/lib/kubelet/pods/96818d4d-0c37-4c66-9f05-70d41cefa01d/volumes" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.451285 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0ca83ae-d916-4c7e-8887-fc12170212fd" path="/var/lib/kubelet/pods/a0ca83ae-d916-4c7e-8887-fc12170212fd/volumes" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.453133 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d551e28f-f3d1-4135-bc78-f606120df286" path="/var/lib/kubelet/pods/d551e28f-f3d1-4135-bc78-f606120df286/volumes" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.454679 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f3ec3613-8ec0-457b-b1d0-3c17a30bae2e" path="/var/lib/kubelet/pods/f3ec3613-8ec0-457b-b1d0-3c17a30bae2e/volumes" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.506750 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.533812 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.543646 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.552314 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.581347 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.604876 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.664456 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.718689 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.782505 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.833216 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.894102 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 
09:04:58.906134 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.991296 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 09:04:58 crc kubenswrapper[4958]: I0320 09:04:58.996352 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.161499 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.172517 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" event={"ID":"33144c81-81c2-46b8-bf4c-234aa6c61ce5","Type":"ContainerStarted","Data":"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286"} Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.172580 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" event={"ID":"33144c81-81c2-46b8-bf4c-234aa6c61ce5","Type":"ContainerStarted","Data":"8d5783a9a9ffc015b4d619ad4b60adb9e00339bf3d91be82fcb4c5670bce7dff"} Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.195366 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" podStartSLOduration=3.195345111 podStartE2EDuration="3.195345111s" podCreationTimestamp="2026-03-20 09:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:04:59.195240588 +0000 UTC m=+319.517256606" watchObservedRunningTime="2026-03-20 09:04:59.195345111 +0000 UTC m=+319.517361079" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.315940 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.370010 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.431369 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.456102 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.536690 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.560851 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.730337 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 09:04:59 crc kubenswrapper[4958]: I0320 09:04:59.859583 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 
09:05:00.063980 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.103774 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.142617 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.162558 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.163296 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.167411 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.171640 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.171707 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.172866 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.173035 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.179979 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.180197 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.186064 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.190036 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.190305 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.209545 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.282990 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.283041 4958 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.283074 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.283132 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shxc5\" (UniqueName: \"kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.283460 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.385323 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.387260 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.388418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.388472 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.389749 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.389841 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-shxc5\" (UniqueName: \"kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.390340 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.391320 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.392420 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.412226 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-shxc5\" (UniqueName: \"kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5\") pod \"controller-manager-85dcd97b9b-75db9\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.480047 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.586376 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.597771 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.674697 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 09:05:00 crc kubenswrapper[4958]: I0320 09:05:00.914103 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.024829 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.162856 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b"] Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.163997 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.166648 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.166762 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167071 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167160 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167167 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167183 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167071 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167499 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.167565 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.174513 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.175114 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.175233 4958 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.181949 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.186507 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.186812 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" event={"ID":"afd2cf5e-082e-49d8-b25c-8d35faa7e529","Type":"ContainerStarted","Data":"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156"} Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.187355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" event={"ID":"afd2cf5e-082e-49d8-b25c-8d35faa7e529","Type":"ContainerStarted","Data":"1592fd068fc465109eeb2196754478e2a4489f157492c9fbdae6de896b5912dd"} Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.187411 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.191884 4958 patch_prober.go:28] interesting pod/controller-manager-85dcd97b9b-75db9 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" start-of-body= Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.191988 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.66:8443/healthz\": dial tcp 10.217.0.66:8443: connect: connection refused" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.192770 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.193699 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b"] Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.197774 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.237081 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" podStartSLOduration=5.23706335 podStartE2EDuration="5.23706335s" podCreationTimestamp="2026-03-20 09:04:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:01.234012287 +0000 UTC m=+321.556028245" watchObservedRunningTime="2026-03-20 09:05:01.23706335 +0000 UTC m=+321.559079298" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.275710 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.305825 4958 
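
[Annotation] The patch_prober / prober.go pair above shows the readiness probe for controller-manager-85dcd97b9b-75db9 failing with "connection refused": the probe ran before the new container bound 10.217.0.66:8443, and kubelet keeps retrying until /healthz answers. A sketch of the equivalent check; the URL comes from the log, and skipping TLS verification merely stands in for kubelet's probe transport, which this sketch does not reproduce.

// Sketch of the HTTP readiness check behind the prober.go lines above.
// The endpoint is taken from the log; everything else is illustrative.
package main

import (
	"crypto/tls"
	"fmt"
	"net/http"
	"time"
)

func main() {
	client := &http.Client{
		Timeout: time.Second,
		Transport: &http.Transport{
			TLSClientConfig: &tls.Config{InsecureSkipVerify: true},
		},
	}
	resp, err := client.Get("https://10.217.0.66:8443/healthz")
	if err != nil {
		fmt.Println("probe failed:", err) // e.g. connect: connection refused
		return
	}
	defer resp.Body.Close()
	fmt.Println("probe status:", resp.Status) // pod turns ready once this succeeds
}
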
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-policies\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.305894 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.305932 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.305959 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.305992 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306017 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5lsw\" (UniqueName: \"kubernetes.io/projected/2a1fb47f-6274-47f0-9b0c-360e32b43f53-kube-api-access-w5lsw\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306060 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306107 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " 
pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306146 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306219 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306289 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-service-ca\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306334 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-dir\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.306452 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.407778 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408121 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-service-ca\") pod 
\"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408252 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-dir\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408526 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408659 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-policies\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408762 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408886 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408993 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.409140 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-idp-0-file-data\") pod 
\"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.409267 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w5lsw\" (UniqueName: \"kubernetes.io/projected/2a1fb47f-6274-47f0-9b0c-360e32b43f53-kube-api-access-w5lsw\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.409371 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.409460 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.410969 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408822 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-service-ca\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.408577 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-dir\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.409583 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-audit-policies\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.411314 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-cliconfig\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " 
pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.414383 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-session\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.416234 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.417003 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-serving-cert\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.417384 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.417412 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-login\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.418447 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-template-error\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.419268 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.422000 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-router-certs\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " 
pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.422704 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2a1fb47f-6274-47f0-9b0c-360e32b43f53-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.434275 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5lsw\" (UniqueName: \"kubernetes.io/projected/2a1fb47f-6274-47f0-9b0c-360e32b43f53-kube-api-access-w5lsw\") pod \"oauth-openshift-7987bb8c7b-5jc9b\" (UID: \"2a1fb47f-6274-47f0-9b0c-360e32b43f53\") " pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.466171 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.484220 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.690618 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b"] Mar 20 09:05:01 crc kubenswrapper[4958]: W0320 09:05:01.695267 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a1fb47f_6274_47f0_9b0c_360e32b43f53.slice/crio-5d4a49131040f0e2c3c804df526976957ca00476751fe4fd969c39716ac0ab89 WatchSource:0}: Error finding container 5d4a49131040f0e2c3c804df526976957ca00476751fe4fd969c39716ac0ab89: Status 404 returned error can't find the container with id 5d4a49131040f0e2c3c804df526976957ca00476751fe4fd969c39716ac0ab89 Mar 20 09:05:01 crc kubenswrapper[4958]: I0320 09:05:01.791368 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.033199 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.197046 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" event={"ID":"2a1fb47f-6274-47f0-9b0c-360e32b43f53","Type":"ContainerStarted","Data":"81a330b5c0249d65727b5903211f9b44a98eec7e02ffcd54d39fbd8d3ef51633"} Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.197101 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" event={"ID":"2a1fb47f-6274-47f0-9b0c-360e32b43f53","Type":"ContainerStarted","Data":"5d4a49131040f0e2c3c804df526976957ca00476751fe4fd969c39716ac0ab89"} Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.199255 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.203877 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:02 crc 
kubenswrapper[4958]: I0320 09:05:02.221497 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" podStartSLOduration=70.221467432 podStartE2EDuration="1m10.221467432s" podCreationTimestamp="2026-03-20 09:03:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:02.215925654 +0000 UTC m=+322.537941632" watchObservedRunningTime="2026-03-20 09:05:02.221467432 +0000 UTC m=+322.543483390" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.235166 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.235269 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-7987bb8c7b-5jc9b" Mar 20 09:05:02 crc kubenswrapper[4958]: I0320 09:05:02.672133 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 09:05:03 crc kubenswrapper[4958]: I0320 09:05:03.190563 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 09:05:07 crc kubenswrapper[4958]: I0320 09:05:07.973967 4958 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 09:05:07 crc kubenswrapper[4958]: I0320 09:05:07.974731 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://07f4cd87d7d7c556da4db9a450ec5263ddd1b8bc876fefdd7167f6bf82eec49d" gracePeriod=5 Mar 20 09:05:13 crc kubenswrapper[4958]: I0320 09:05:13.274502 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 09:05:13 crc kubenswrapper[4958]: I0320 09:05:13.274574 4958 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="07f4cd87d7d7c556da4db9a450ec5263ddd1b8bc876fefdd7167f6bf82eec49d" exitCode=137 Mar 20 09:05:13 crc kubenswrapper[4958]: I0320 09:05:13.964386 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 09:05:13 crc kubenswrapper[4958]: I0320 09:05:13.965075 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089291 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089486 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089517 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089516 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089559 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089638 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089735 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089760 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.089878 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.090496 4958 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.090525 4958 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.090534 4958 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.090544 4958 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.101210 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.191898 4958 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.289363 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.289487 4958 scope.go:117] "RemoveContainer" containerID="07f4cd87d7d7c556da4db9a450ec5263ddd1b8bc876fefdd7167f6bf82eec49d" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.289750 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 09:05:14 crc kubenswrapper[4958]: I0320 09:05:14.444779 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 09:05:16 crc kubenswrapper[4958]: I0320 09:05:16.547633 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:16 crc kubenswrapper[4958]: I0320 09:05:16.548095 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerName="controller-manager" containerID="cri-o://f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156" gracePeriod=30 Mar 20 09:05:16 crc kubenswrapper[4958]: I0320 09:05:16.570386 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:05:16 crc kubenswrapper[4958]: I0320 09:05:16.571851 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" podUID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" containerName="route-controller-manager" containerID="cri-o://4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286" gracePeriod=30 Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.127877 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.139638 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config\") pod \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.139714 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca\") pod \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.139840 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert\") pod \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.139886 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zjrnt\" (UniqueName: \"kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt\") pod \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\" (UID: \"33144c81-81c2-46b8-bf4c-234aa6c61ce5\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.141011 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca" (OuterVolumeSpecName: "client-ca") pod "33144c81-81c2-46b8-bf4c-234aa6c61ce5" (UID: "33144c81-81c2-46b8-bf4c-234aa6c61ce5"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.142482 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config" (OuterVolumeSpecName: "config") pod "33144c81-81c2-46b8-bf4c-234aa6c61ce5" (UID: "33144c81-81c2-46b8-bf4c-234aa6c61ce5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.150783 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "33144c81-81c2-46b8-bf4c-234aa6c61ce5" (UID: "33144c81-81c2-46b8-bf4c-234aa6c61ce5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.150806 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt" (OuterVolumeSpecName: "kube-api-access-zjrnt") pod "33144c81-81c2-46b8-bf4c-234aa6c61ce5" (UID: "33144c81-81c2-46b8-bf4c-234aa6c61ce5"). InnerVolumeSpecName "kube-api-access-zjrnt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.184992 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.241779 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shxc5\" (UniqueName: \"kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5\") pod \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.241890 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca\") pod \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.241984 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert\") pod \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242055 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles\") pod \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242090 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config\") pod \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\" (UID: \"afd2cf5e-082e-49d8-b25c-8d35faa7e529\") " Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242346 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242357 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/33144c81-81c2-46b8-bf4c-234aa6c61ce5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242367 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zjrnt\" (UniqueName: \"kubernetes.io/projected/33144c81-81c2-46b8-bf4c-234aa6c61ce5-kube-api-access-zjrnt\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.242395 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/33144c81-81c2-46b8-bf4c-234aa6c61ce5-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.243347 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca" (OuterVolumeSpecName: "client-ca") pod "afd2cf5e-082e-49d8-b25c-8d35faa7e529" (UID: "afd2cf5e-082e-49d8-b25c-8d35faa7e529"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.243417 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "afd2cf5e-082e-49d8-b25c-8d35faa7e529" (UID: "afd2cf5e-082e-49d8-b25c-8d35faa7e529"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.243455 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config" (OuterVolumeSpecName: "config") pod "afd2cf5e-082e-49d8-b25c-8d35faa7e529" (UID: "afd2cf5e-082e-49d8-b25c-8d35faa7e529"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.246245 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5" (OuterVolumeSpecName: "kube-api-access-shxc5") pod "afd2cf5e-082e-49d8-b25c-8d35faa7e529" (UID: "afd2cf5e-082e-49d8-b25c-8d35faa7e529"). InnerVolumeSpecName "kube-api-access-shxc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.246819 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "afd2cf5e-082e-49d8-b25c-8d35faa7e529" (UID: "afd2cf5e-082e-49d8-b25c-8d35faa7e529"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.311189 4958 generic.go:334] "Generic (PLEG): container finished" podID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" containerID="4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286" exitCode=0 Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.311238 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.311240 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" event={"ID":"33144c81-81c2-46b8-bf4c-234aa6c61ce5","Type":"ContainerDied","Data":"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286"} Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.311287 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7" event={"ID":"33144c81-81c2-46b8-bf4c-234aa6c61ce5","Type":"ContainerDied","Data":"8d5783a9a9ffc015b4d619ad4b60adb9e00339bf3d91be82fcb4c5670bce7dff"} Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.311306 4958 scope.go:117] "RemoveContainer" containerID="4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.313468 4958 generic.go:334] "Generic (PLEG): container finished" podID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerID="f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156" exitCode=0 Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.313503 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" event={"ID":"afd2cf5e-082e-49d8-b25c-8d35faa7e529","Type":"ContainerDied","Data":"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156"} Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.313524 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" event={"ID":"afd2cf5e-082e-49d8-b25c-8d35faa7e529","Type":"ContainerDied","Data":"1592fd068fc465109eeb2196754478e2a4489f157492c9fbdae6de896b5912dd"} Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.313585 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-75db9" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.343214 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.343250 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.343262 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shxc5\" (UniqueName: \"kubernetes.io/projected/afd2cf5e-082e-49d8-b25c-8d35faa7e529-kube-api-access-shxc5\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.343275 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/afd2cf5e-082e-49d8-b25c-8d35faa7e529-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.343284 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/afd2cf5e-082e-49d8-b25c-8d35faa7e529-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.344866 4958 scope.go:117] "RemoveContainer" containerID="4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286" Mar 20 09:05:17 crc kubenswrapper[4958]: E0320 09:05:17.345637 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286\": container with ID starting with 4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286 not found: ID does not exist" containerID="4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.345689 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286"} err="failed to get container status \"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286\": rpc error: code = NotFound desc = could not find container \"4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286\": container with ID starting with 4df6de4e2d9861b94b3ab23f6d09a78dab5c7bf19a13f060ca47ce2a8a28b286 not found: ID does not exist" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.345724 4958 scope.go:117] "RemoveContainer" containerID="f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.346725 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.362444 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-zwll7"] Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.365801 4958 scope.go:117] "RemoveContainer" containerID="f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156" Mar 20 09:05:17 crc kubenswrapper[4958]: E0320 09:05:17.366413 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc 
error: code = NotFound desc = could not find container \"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156\": container with ID starting with f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156 not found: ID does not exist" containerID="f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.366475 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156"} err="failed to get container status \"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156\": rpc error: code = NotFound desc = could not find container \"f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156\": container with ID starting with f82e3f8f818c964a625080e545826cc88707b94c6d053ab25c9c41a0c9686156 not found: ID does not exist" Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.378448 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:17 crc kubenswrapper[4958]: I0320 09:05:17.397136 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-75db9"] Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.175681 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:05:18 crc kubenswrapper[4958]: E0320 09:05:18.178477 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerName="controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.180012 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerName="controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: E0320 09:05:18.180180 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.180321 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 09:05:18 crc kubenswrapper[4958]: E0320 09:05:18.180492 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" containerName="route-controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.180759 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" containerName="route-controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.181462 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" containerName="route-controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.181660 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.182003 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" containerName="controller-manager" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.184837 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:18 crc kubenswrapper[4958]: 
I0320 09:05:18.185139 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.187023 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.189023 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.193048 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.193901 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.194319 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.194552 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.194805 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.195218 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.195352 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.195557 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.195736 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.195876 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.196011 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.196220 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.196352 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.199096 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.261876 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: 
\"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.261947 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b566g\" (UniqueName: \"kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.261984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262017 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262155 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262183 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262233 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262266 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.262421 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert\") pod 
\"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.364641 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.364776 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.364857 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.364903 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.365016 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.365097 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.365147 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.365207 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" 
Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.365291 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b566g\" (UniqueName: \"kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.366468 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.367457 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.368096 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.369107 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.371733 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.371744 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.371973 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.392971 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b566g\" (UniqueName: 
\"kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g\") pod \"controller-manager-67c9f74866-9bnt7\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.393276 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8\") pod \"route-controller-manager-6d6b97b7c-gwfkf\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.442270 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="33144c81-81c2-46b8-bf4c-234aa6c61ce5" path="/var/lib/kubelet/pods/33144c81-81c2-46b8-bf4c-234aa6c61ce5/volumes" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.442838 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="afd2cf5e-082e-49d8-b25c-8d35faa7e529" path="/var/lib/kubelet/pods/afd2cf5e-082e-49d8-b25c-8d35faa7e529/volumes" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.528532 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.533751 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.748763 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:18 crc kubenswrapper[4958]: W0320 09:05:18.766693 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb58ca0fd_f06f_44ee_a1da_80cf9d6b1c9c.slice/crio-b7984c9af6f55c1580b04df107e1ee8137e5c67885549d7ba062c91d661bfc97 WatchSource:0}: Error finding container b7984c9af6f55c1580b04df107e1ee8137e5c67885549d7ba062c91d661bfc97: Status 404 returned error can't find the container with id b7984c9af6f55c1580b04df107e1ee8137e5c67885549d7ba062c91d661bfc97 Mar 20 09:05:18 crc kubenswrapper[4958]: W0320 09:05:18.810348 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d842e09_5b6f_4f7a_b962_367a09f87d73.slice/crio-a06f2a2bf2640e827093faa8bb83096037d6f722ed1d9df6982c28c84eeb4302 WatchSource:0}: Error finding container a06f2a2bf2640e827093faa8bb83096037d6f722ed1d9df6982c28c84eeb4302: Status 404 returned error can't find the container with id a06f2a2bf2640e827093faa8bb83096037d6f722ed1d9df6982c28c84eeb4302 Mar 20 09:05:18 crc kubenswrapper[4958]: I0320 09:05:18.855994 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.331549 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" event={"ID":"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c","Type":"ContainerStarted","Data":"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b"} Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.331614 4958 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" event={"ID":"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c","Type":"ContainerStarted","Data":"b7984c9af6f55c1580b04df107e1ee8137e5c67885549d7ba062c91d661bfc97"} Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.332627 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.337673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" event={"ID":"8d842e09-5b6f-4f7a-b962-367a09f87d73","Type":"ContainerStarted","Data":"e653cee3baf0db8ed7703b85bf37017571a4032c1e8fb5e369d3e357af15683a"} Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.337723 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" event={"ID":"8d842e09-5b6f-4f7a-b962-367a09f87d73","Type":"ContainerStarted","Data":"a06f2a2bf2640e827093faa8bb83096037d6f722ed1d9df6982c28c84eeb4302"} Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.338568 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.342979 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.349763 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" podStartSLOduration=3.349740767 podStartE2EDuration="3.349740767s" podCreationTimestamp="2026-03-20 09:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:19.347863003 +0000 UTC m=+339.669878981" watchObservedRunningTime="2026-03-20 09:05:19.349740767 +0000 UTC m=+339.671756725" Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.366342 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" podStartSLOduration=3.366316473 podStartE2EDuration="3.366316473s" podCreationTimestamp="2026-03-20 09:05:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:19.363348239 +0000 UTC m=+339.685364207" watchObservedRunningTime="2026-03-20 09:05:19.366316473 +0000 UTC m=+339.688332431" Mar 20 09:05:19 crc kubenswrapper[4958]: I0320 09:05:19.758158 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.171009 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"] Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.173248 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9xwld" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="registry-server" containerID="cri-o://5dd344b339bab4611f3abc5ecba6047e2b5ae1eddda5fbba4934c898d7451834" gracePeriod=2 Mar 20 09:05:30 crc 
kubenswrapper[4958]: I0320 09:05:30.364740 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"] Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.365174 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smdkg" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="registry-server" containerID="cri-o://d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a" gracePeriod=2 Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.413646 4958 generic.go:334] "Generic (PLEG): container finished" podID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerID="5dd344b339bab4611f3abc5ecba6047e2b5ae1eddda5fbba4934c898d7451834" exitCode=0 Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.413705 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerDied","Data":"5dd344b339bab4611f3abc5ecba6047e2b5ae1eddda5fbba4934c898d7451834"} Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.739883 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.760914 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bsgc6\" (UniqueName: \"kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6\") pod \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.761037 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities\") pod \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.761154 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content\") pod \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\" (UID: \"f21e8593-4125-4ea1-ad7f-be4bb994ed6e\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.761950 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities" (OuterVolumeSpecName: "utilities") pod "f21e8593-4125-4ea1-ad7f-be4bb994ed6e" (UID: "f21e8593-4125-4ea1-ad7f-be4bb994ed6e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.771945 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6" (OuterVolumeSpecName: "kube-api-access-bsgc6") pod "f21e8593-4125-4ea1-ad7f-be4bb994ed6e" (UID: "f21e8593-4125-4ea1-ad7f-be4bb994ed6e"). InnerVolumeSpecName "kube-api-access-bsgc6". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.807551 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f21e8593-4125-4ea1-ad7f-be4bb994ed6e" (UID: "f21e8593-4125-4ea1-ad7f-be4bb994ed6e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.847692 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smdkg" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.862508 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.862559 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.862569 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bsgc6\" (UniqueName: \"kubernetes.io/projected/f21e8593-4125-4ea1-ad7f-be4bb994ed6e-kube-api-access-bsgc6\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.963854 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities\") pod \"faa90514-f83a-442b-9d17-08ff904728f2\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.964399 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content\") pod \"faa90514-f83a-442b-9d17-08ff904728f2\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.964509 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2fps\" (UniqueName: \"kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps\") pod \"faa90514-f83a-442b-9d17-08ff904728f2\" (UID: \"faa90514-f83a-442b-9d17-08ff904728f2\") " Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.964624 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities" (OuterVolumeSpecName: "utilities") pod "faa90514-f83a-442b-9d17-08ff904728f2" (UID: "faa90514-f83a-442b-9d17-08ff904728f2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.964892 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:30 crc kubenswrapper[4958]: I0320 09:05:30.967646 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps" (OuterVolumeSpecName: "kube-api-access-q2fps") pod "faa90514-f83a-442b-9d17-08ff904728f2" (UID: "faa90514-f83a-442b-9d17-08ff904728f2"). InnerVolumeSpecName "kube-api-access-q2fps". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.066658 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2fps\" (UniqueName: \"kubernetes.io/projected/faa90514-f83a-442b-9d17-08ff904728f2-kube-api-access-q2fps\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.131764 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "faa90514-f83a-442b-9d17-08ff904728f2" (UID: "faa90514-f83a-442b-9d17-08ff904728f2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.172648 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/faa90514-f83a-442b-9d17-08ff904728f2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.421483 4958 generic.go:334] "Generic (PLEG): container finished" podID="faa90514-f83a-442b-9d17-08ff904728f2" containerID="d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a" exitCode=0 Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.421573 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerDied","Data":"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a"} Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.421637 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smdkg" event={"ID":"faa90514-f83a-442b-9d17-08ff904728f2","Type":"ContainerDied","Data":"ff00aded18fe65038227f78eab1ededad4551f257fe3c0f805cab24c97bce612"} Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.421664 4958 scope.go:117] "RemoveContainer" containerID="d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.421833 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smdkg" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.431817 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9xwld" event={"ID":"f21e8593-4125-4ea1-ad7f-be4bb994ed6e","Type":"ContainerDied","Data":"c19db44c2cc9ae35a82449d1efe2d336b7972ed261ac04c8b7f132a57184ccf1"} Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.432108 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9xwld" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.454983 4958 scope.go:117] "RemoveContainer" containerID="863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.470474 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"] Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.478748 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9xwld"] Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.481918 4958 scope.go:117] "RemoveContainer" containerID="07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.483173 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"] Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.487044 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smdkg"] Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.495896 4958 scope.go:117] "RemoveContainer" containerID="d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a" Mar 20 09:05:31 crc kubenswrapper[4958]: E0320 09:05:31.496367 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a\": container with ID starting with d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a not found: ID does not exist" containerID="d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.496420 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a"} err="failed to get container status \"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a\": rpc error: code = NotFound desc = could not find container \"d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a\": container with ID starting with d3484cd205c1f2df80f0184341582e8f06ef4609b9b130b06caf719318ba792a not found: ID does not exist" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.496453 4958 scope.go:117] "RemoveContainer" containerID="863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96" Mar 20 09:05:31 crc kubenswrapper[4958]: E0320 09:05:31.499854 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96\": container with ID starting with 863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96 not found: ID does not exist" containerID="863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.499916 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96"} err="failed to get container status \"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96\": rpc error: code = NotFound desc = could not find container \"863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96\": container with ID starting with 
863ef9ded0c38dff66d258c9e23c8dba625bc606f792e988bce8999403917e96 not found: ID does not exist" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.499949 4958 scope.go:117] "RemoveContainer" containerID="07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241" Mar 20 09:05:31 crc kubenswrapper[4958]: E0320 09:05:31.500248 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241\": container with ID starting with 07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241 not found: ID does not exist" containerID="07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.500282 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241"} err="failed to get container status \"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241\": rpc error: code = NotFound desc = could not find container \"07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241\": container with ID starting with 07a7fe77b302e981728d3aa82530c26675fa63ec4fb5497181e94c29742c2241 not found: ID does not exist" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.500306 4958 scope.go:117] "RemoveContainer" containerID="5dd344b339bab4611f3abc5ecba6047e2b5ae1eddda5fbba4934c898d7451834" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.512661 4958 scope.go:117] "RemoveContainer" containerID="c4fd84794f34339505882babeedccf0842923e38187c141049a17bb5913860b5" Mar 20 09:05:31 crc kubenswrapper[4958]: I0320 09:05:31.526708 4958 scope.go:117] "RemoveContainer" containerID="395afe424d1d4901498ff41ef21c320b812e38a35d0662178cd19fee2806bf1d" Mar 20 09:05:32 crc kubenswrapper[4958]: I0320 09:05:32.444625 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" path="/var/lib/kubelet/pods/f21e8593-4125-4ea1-ad7f-be4bb994ed6e/volumes" Mar 20 09:05:32 crc kubenswrapper[4958]: I0320 09:05:32.445458 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="faa90514-f83a-442b-9d17-08ff904728f2" path="/var/lib/kubelet/pods/faa90514-f83a-442b-9d17-08ff904728f2/volumes" Mar 20 09:05:56 crc kubenswrapper[4958]: I0320 09:05:56.544678 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:56 crc kubenswrapper[4958]: I0320 09:05:56.545882 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" podUID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" containerName="route-controller-manager" containerID="cri-o://66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b" gracePeriod=30 Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.022870 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.059114 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca\") pod \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.059268 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert\") pod \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.059348 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config\") pod \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.059390 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8\") pod \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\" (UID: \"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c\") " Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.061005 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca" (OuterVolumeSpecName: "client-ca") pod "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" (UID: "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.061100 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config" (OuterVolumeSpecName: "config") pod "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" (UID: "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.068829 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" (UID: "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.069442 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8" (OuterVolumeSpecName: "kube-api-access-snvs8") pod "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" (UID: "b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c"). InnerVolumeSpecName "kube-api-access-snvs8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.160677 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.160735 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.160753 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.160770 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snvs8\" (UniqueName: \"kubernetes.io/projected/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c-kube-api-access-snvs8\") on node \"crc\" DevicePath \"\"" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.633579 4958 generic.go:334] "Generic (PLEG): container finished" podID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" containerID="66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b" exitCode=0 Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.634192 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" event={"ID":"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c","Type":"ContainerDied","Data":"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b"} Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.634232 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" event={"ID":"b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c","Type":"ContainerDied","Data":"b7984c9af6f55c1580b04df107e1ee8137e5c67885549d7ba062c91d661bfc97"} Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.634255 4958 scope.go:117] "RemoveContainer" containerID="66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.634431 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.654156 4958 scope.go:117] "RemoveContainer" containerID="66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b" Mar 20 09:05:57 crc kubenswrapper[4958]: E0320 09:05:57.655476 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b\": container with ID starting with 66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b not found: ID does not exist" containerID="66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.655534 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b"} err="failed to get container status \"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b\": rpc error: code = NotFound desc = could not find container \"66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b\": container with ID starting with 66d828d4904cb1fde6e60fada321b5f7d0976e0f29ff03e4b8508279797cc13b not found: ID does not exist" Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.666759 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:57 crc kubenswrapper[4958]: I0320 09:05:57.670187 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d6b97b7c-gwfkf"] Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.213842 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt"] Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.214522 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.214690 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.214767 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="extract-utilities" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.214836 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="extract-utilities" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.214901 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" containerName="route-controller-manager" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.214955 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" containerName="route-controller-manager" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.215015 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="extract-utilities" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215070 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa90514-f83a-442b-9d17-08ff904728f2" 
containerName="extract-utilities" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.215127 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="extract-content" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215239 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="extract-content" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.215314 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="extract-content" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215374 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="extract-content" Mar 20 09:05:58 crc kubenswrapper[4958]: E0320 09:05:58.215436 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215489 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215736 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f21e8593-4125-4ea1-ad7f-be4bb994ed6e" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215827 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" containerName="route-controller-manager" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.215886 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="faa90514-f83a-442b-9d17-08ff904728f2" containerName="registry-server" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.216573 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.218418 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt"] Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.224482 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.224739 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.224884 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.224525 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.225197 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.225416 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.283868 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qhb6t\" (UniqueName: \"kubernetes.io/projected/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-kube-api-access-qhb6t\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.284269 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-client-ca\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.284373 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-config\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.284470 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-serving-cert\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.386107 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qhb6t\" (UniqueName: \"kubernetes.io/projected/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-kube-api-access-qhb6t\") pod 
\"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.386167 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-client-ca\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.386199 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-config\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.386224 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-serving-cert\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.387621 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-client-ca\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.388068 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-config\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.395723 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-serving-cert\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.408259 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qhb6t\" (UniqueName: \"kubernetes.io/projected/6c4d8272-19ce-4972-b2a3-519b9bfbaee6-kube-api-access-qhb6t\") pod \"route-controller-manager-55f4f49fdf-lxbtt\" (UID: \"6c4d8272-19ce-4972-b2a3-519b9bfbaee6\") " pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.444929 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c" path="/var/lib/kubelet/pods/b58ca0fd-f06f-44ee-a1da-80cf9d6b1c9c/volumes" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.540801 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:58 crc kubenswrapper[4958]: I0320 09:05:58.973241 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt"] Mar 20 09:05:59 crc kubenswrapper[4958]: I0320 09:05:59.668076 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" event={"ID":"6c4d8272-19ce-4972-b2a3-519b9bfbaee6","Type":"ContainerStarted","Data":"1558b32fac3a6318ed1414bcc2061004707151078c8a1cf970ded8b5d96b07f3"} Mar 20 09:05:59 crc kubenswrapper[4958]: I0320 09:05:59.668131 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" event={"ID":"6c4d8272-19ce-4972-b2a3-519b9bfbaee6","Type":"ContainerStarted","Data":"9c7ee40d7e40fc03d4d55218bc1328efcf28c78ff273844fb8a4d428596a1ba0"} Mar 20 09:05:59 crc kubenswrapper[4958]: I0320 09:05:59.668335 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:59 crc kubenswrapper[4958]: I0320 09:05:59.674850 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" Mar 20 09:05:59 crc kubenswrapper[4958]: I0320 09:05:59.688304 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55f4f49fdf-lxbtt" podStartSLOduration=3.6882900899999997 podStartE2EDuration="3.68829009s" podCreationTimestamp="2026-03-20 09:05:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:05:59.685898762 +0000 UTC m=+380.007914720" watchObservedRunningTime="2026-03-20 09:05:59.68829009 +0000 UTC m=+380.010306038" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.194412 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566626-k6brk"] Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.195455 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.197747 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.197928 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.198144 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.206859 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-k6brk"] Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.316424 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7ndm9\" (UniqueName: \"kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9\") pod \"auto-csr-approver-29566626-k6brk\" (UID: \"c37025e7-c9ef-4f2b-bddd-fe015cb30722\") " pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.418384 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7ndm9\" (UniqueName: \"kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9\") pod \"auto-csr-approver-29566626-k6brk\" (UID: \"c37025e7-c9ef-4f2b-bddd-fe015cb30722\") " pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.442342 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7ndm9\" (UniqueName: \"kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9\") pod \"auto-csr-approver-29566626-k6brk\" (UID: \"c37025e7-c9ef-4f2b-bddd-fe015cb30722\") " pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:00 crc kubenswrapper[4958]: I0320 09:06:00.514411 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:01 crc kubenswrapper[4958]: I0320 09:06:01.005297 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-k6brk"] Mar 20 09:06:01 crc kubenswrapper[4958]: I0320 09:06:01.685467 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-k6brk" event={"ID":"c37025e7-c9ef-4f2b-bddd-fe015cb30722","Type":"ContainerStarted","Data":"405fa5e0f60ead49dc93ea486850ad6c54aa4fed6c627524542dfdd668b69787"} Mar 20 09:06:02 crc kubenswrapper[4958]: I0320 09:06:02.694469 4958 generic.go:334] "Generic (PLEG): container finished" podID="c37025e7-c9ef-4f2b-bddd-fe015cb30722" containerID="fc2bb1acaf8b13cd480fc90bd5409f5f4e2efab85cee97a0a77c863d31245fa2" exitCode=0 Mar 20 09:06:02 crc kubenswrapper[4958]: I0320 09:06:02.694537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-k6brk" event={"ID":"c37025e7-c9ef-4f2b-bddd-fe015cb30722","Type":"ContainerDied","Data":"fc2bb1acaf8b13cd480fc90bd5409f5f4e2efab85cee97a0a77c863d31245fa2"} Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.028221 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.097961 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7ndm9\" (UniqueName: \"kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9\") pod \"c37025e7-c9ef-4f2b-bddd-fe015cb30722\" (UID: \"c37025e7-c9ef-4f2b-bddd-fe015cb30722\") " Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.106551 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9" (OuterVolumeSpecName: "kube-api-access-7ndm9") pod "c37025e7-c9ef-4f2b-bddd-fe015cb30722" (UID: "c37025e7-c9ef-4f2b-bddd-fe015cb30722"). InnerVolumeSpecName "kube-api-access-7ndm9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.200754 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7ndm9\" (UniqueName: \"kubernetes.io/projected/c37025e7-c9ef-4f2b-bddd-fe015cb30722-kube-api-access-7ndm9\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.713156 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566626-k6brk" event={"ID":"c37025e7-c9ef-4f2b-bddd-fe015cb30722","Type":"ContainerDied","Data":"405fa5e0f60ead49dc93ea486850ad6c54aa4fed6c627524542dfdd668b69787"} Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.713210 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="405fa5e0f60ead49dc93ea486850ad6c54aa4fed6c627524542dfdd668b69787" Mar 20 09:06:04 crc kubenswrapper[4958]: I0320 09:06:04.713249 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566626-k6brk" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.676542 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trn9v"] Mar 20 09:06:16 crc kubenswrapper[4958]: E0320 09:06:16.677796 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c37025e7-c9ef-4f2b-bddd-fe015cb30722" containerName="oc" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.677814 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c37025e7-c9ef-4f2b-bddd-fe015cb30722" containerName="oc" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.677943 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c37025e7-c9ef-4f2b-bddd-fe015cb30722" containerName="oc" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.678587 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.695425 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trn9v"] Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.801336 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/739ce2fb-59f4-45c6-88d6-e58d3a01682a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.801921 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/739ce2fb-59f4-45c6-88d6-e58d3a01682a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.801989 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wn5r\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-kube-api-access-4wn5r\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.802017 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-certificates\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.802049 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-bound-sa-token\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.802070 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-tls\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.802090 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-trusted-ca\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.802180 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.842157 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.904135 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-tls\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.904546 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-trusted-ca\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.904752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/739ce2fb-59f4-45c6-88d6-e58d3a01682a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.904903 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/739ce2fb-59f4-45c6-88d6-e58d3a01682a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.905047 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wn5r\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-kube-api-access-4wn5r\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.905195 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-certificates\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.906368 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-bound-sa-token\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.905337 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/739ce2fb-59f4-45c6-88d6-e58d3a01682a-ca-trust-extracted\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.906021 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-trusted-ca\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.906893 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-certificates\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.911729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/739ce2fb-59f4-45c6-88d6-e58d3a01682a-installation-pull-secrets\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.912032 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-registry-tls\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.921547 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wn5r\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-kube-api-access-4wn5r\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.929953 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/739ce2fb-59f4-45c6-88d6-e58d3a01682a-bound-sa-token\") pod \"image-registry-66df7c8f76-trn9v\" (UID: \"739ce2fb-59f4-45c6-88d6-e58d3a01682a\") " pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:16 crc kubenswrapper[4958]: I0320 09:06:16.995019 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:17 crc kubenswrapper[4958]: I0320 09:06:17.248993 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-trn9v"] Mar 20 09:06:17 crc kubenswrapper[4958]: W0320 09:06:17.256055 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod739ce2fb_59f4_45c6_88d6_e58d3a01682a.slice/crio-31f2ee38cae28f8c7aa78204ebf640622acdf5b5530b5d85af7eb19c9e346fe7 WatchSource:0}: Error finding container 31f2ee38cae28f8c7aa78204ebf640622acdf5b5530b5d85af7eb19c9e346fe7: Status 404 returned error can't find the container with id 31f2ee38cae28f8c7aa78204ebf640622acdf5b5530b5d85af7eb19c9e346fe7 Mar 20 09:06:17 crc kubenswrapper[4958]: I0320 09:06:17.791736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" event={"ID":"739ce2fb-59f4-45c6-88d6-e58d3a01682a","Type":"ContainerStarted","Data":"635b8a712bfe18fcafaf52e87bef9df2915cb95613a8a94ef59edc362c5b7a40"} Mar 20 09:06:17 crc kubenswrapper[4958]: I0320 09:06:17.791797 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" event={"ID":"739ce2fb-59f4-45c6-88d6-e58d3a01682a","Type":"ContainerStarted","Data":"31f2ee38cae28f8c7aa78204ebf640622acdf5b5530b5d85af7eb19c9e346fe7"} Mar 20 09:06:17 crc kubenswrapper[4958]: I0320 09:06:17.791899 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:17 crc kubenswrapper[4958]: I0320 09:06:17.817562 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" podStartSLOduration=1.817525397 podStartE2EDuration="1.817525397s" podCreationTimestamp="2026-03-20 09:06:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:17.813680644 +0000 UTC m=+398.135696622" watchObservedRunningTime="2026-03-20 09:06:17.817525397 +0000 UTC m=+398.139541355" Mar 20 09:06:26 crc kubenswrapper[4958]: I0320 09:06:26.521811 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:06:26 crc kubenswrapper[4958]: I0320 09:06:26.522375 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:06:36 crc kubenswrapper[4958]: I0320 09:06:36.548828 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:06:36 crc kubenswrapper[4958]: I0320 09:06:36.549995 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" podUID="8d842e09-5b6f-4f7a-b962-367a09f87d73" containerName="controller-manager" 
containerID="cri-o://e653cee3baf0db8ed7703b85bf37017571a4032c1e8fb5e369d3e357af15683a" gracePeriod=30 Mar 20 09:06:36 crc kubenswrapper[4958]: I0320 09:06:36.911970 4958 generic.go:334] "Generic (PLEG): container finished" podID="8d842e09-5b6f-4f7a-b962-367a09f87d73" containerID="e653cee3baf0db8ed7703b85bf37017571a4032c1e8fb5e369d3e357af15683a" exitCode=0 Mar 20 09:06:36 crc kubenswrapper[4958]: I0320 09:06:36.912084 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" event={"ID":"8d842e09-5b6f-4f7a-b962-367a09f87d73","Type":"ContainerDied","Data":"e653cee3baf0db8ed7703b85bf37017571a4032c1e8fb5e369d3e357af15683a"} Mar 20 09:06:36 crc kubenswrapper[4958]: I0320 09:06:36.962811 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.002007 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-trn9v" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.069700 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"] Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.087792 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b566g\" (UniqueName: \"kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g\") pod \"8d842e09-5b6f-4f7a-b962-367a09f87d73\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.087875 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca\") pod \"8d842e09-5b6f-4f7a-b962-367a09f87d73\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.087976 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config\") pod \"8d842e09-5b6f-4f7a-b962-367a09f87d73\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.088073 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert\") pod \"8d842e09-5b6f-4f7a-b962-367a09f87d73\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.088123 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles\") pod \"8d842e09-5b6f-4f7a-b962-367a09f87d73\" (UID: \"8d842e09-5b6f-4f7a-b962-367a09f87d73\") " Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.090794 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca" (OuterVolumeSpecName: "client-ca") pod "8d842e09-5b6f-4f7a-b962-367a09f87d73" (UID: "8d842e09-5b6f-4f7a-b962-367a09f87d73"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.091503 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config" (OuterVolumeSpecName: "config") pod "8d842e09-5b6f-4f7a-b962-367a09f87d73" (UID: "8d842e09-5b6f-4f7a-b962-367a09f87d73"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.092288 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "8d842e09-5b6f-4f7a-b962-367a09f87d73" (UID: "8d842e09-5b6f-4f7a-b962-367a09f87d73"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.097983 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8d842e09-5b6f-4f7a-b962-367a09f87d73" (UID: "8d842e09-5b6f-4f7a-b962-367a09f87d73"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.098046 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g" (OuterVolumeSpecName: "kube-api-access-b566g") pod "8d842e09-5b6f-4f7a-b962-367a09f87d73" (UID: "8d842e09-5b6f-4f7a-b962-367a09f87d73"). InnerVolumeSpecName "kube-api-access-b566g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.190703 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b566g\" (UniqueName: \"kubernetes.io/projected/8d842e09-5b6f-4f7a-b962-367a09f87d73-kube-api-access-b566g\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.190755 4958 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.190771 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.190783 4958 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8d842e09-5b6f-4f7a-b962-367a09f87d73-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.190795 4958 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/8d842e09-5b6f-4f7a-b962-367a09f87d73-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.936337 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" event={"ID":"8d842e09-5b6f-4f7a-b962-367a09f87d73","Type":"ContainerDied","Data":"a06f2a2bf2640e827093faa8bb83096037d6f722ed1d9df6982c28c84eeb4302"} Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.936854 4958 
scope.go:117] "RemoveContainer" containerID="e653cee3baf0db8ed7703b85bf37017571a4032c1e8fb5e369d3e357af15683a" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.937031 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-67c9f74866-9bnt7" Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.985137 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:06:37 crc kubenswrapper[4958]: I0320 09:06:37.992340 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-67c9f74866-9bnt7"] Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.231324 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4"] Mar 20 09:06:38 crc kubenswrapper[4958]: E0320 09:06:38.231759 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8d842e09-5b6f-4f7a-b962-367a09f87d73" containerName="controller-manager" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.231783 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d842e09-5b6f-4f7a-b962-367a09f87d73" containerName="controller-manager" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.231916 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8d842e09-5b6f-4f7a-b962-367a09f87d73" containerName="controller-manager" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.232473 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.235914 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.240355 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.240693 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.241299 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.241844 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.242034 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.248230 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4"] Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.252826 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.310321 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-client-ca\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: 
\"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.310376 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n5lkp\" (UniqueName: \"kubernetes.io/projected/3b282a24-2070-48d2-8de8-1693289ffd16-kube-api-access-n5lkp\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.310423 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-config\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.310555 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.310624 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b282a24-2070-48d2-8de8-1693289ffd16-serving-cert\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.412844 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b282a24-2070-48d2-8de8-1693289ffd16-serving-cert\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.412913 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-client-ca\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.412934 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n5lkp\" (UniqueName: \"kubernetes.io/projected/3b282a24-2070-48d2-8de8-1693289ffd16-kube-api-access-n5lkp\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.412972 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-config\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 
09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.413015 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.414366 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-proxy-ca-bundles\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.414785 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-client-ca\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.415247 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3b282a24-2070-48d2-8de8-1693289ffd16-config\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.421395 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3b282a24-2070-48d2-8de8-1693289ffd16-serving-cert\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.438483 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n5lkp\" (UniqueName: \"kubernetes.io/projected/3b282a24-2070-48d2-8de8-1693289ffd16-kube-api-access-n5lkp\") pod \"controller-manager-85dcd97b9b-dx2s4\" (UID: \"3b282a24-2070-48d2-8de8-1693289ffd16\") " pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.450774 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d842e09-5b6f-4f7a-b962-367a09f87d73" path="/var/lib/kubelet/pods/8d842e09-5b6f-4f7a-b962-367a09f87d73/volumes" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.556703 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.789093 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4"] Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.944346 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" event={"ID":"3b282a24-2070-48d2-8de8-1693289ffd16","Type":"ContainerStarted","Data":"731c111ce6829b1363cab27618b6f6e30a4c9ab598d7df1579927dc180987497"} Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.946064 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" event={"ID":"3b282a24-2070-48d2-8de8-1693289ffd16","Type":"ContainerStarted","Data":"341873a3b2d877f2fbea1828ed63b6979e4463dbbf4f14c23d7091fd6a85264b"} Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.946225 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.946517 4958 patch_prober.go:28] interesting pod/controller-manager-85dcd97b9b-dx2s4 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" start-of-body= Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.946646 4958 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" podUID="3b282a24-2070-48d2-8de8-1693289ffd16" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.73:8443/healthz\": dial tcp 10.217.0.73:8443: connect: connection refused" Mar 20 09:06:38 crc kubenswrapper[4958]: I0320 09:06:38.975900 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" podStartSLOduration=2.975869009 podStartE2EDuration="2.975869009s" podCreationTimestamp="2026-03-20 09:06:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:38.972093589 +0000 UTC m=+419.294109557" watchObservedRunningTime="2026-03-20 09:06:38.975869009 +0000 UTC m=+419.297884977" Mar 20 09:06:39 crc kubenswrapper[4958]: I0320 09:06:39.965161 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-85dcd97b9b-dx2s4" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.392762 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-549hv"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.394326 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-549hv" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="registry-server" containerID="cri-o://ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830" gracePeriod=30 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.417835 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpjsp"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.418182 4958 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-marketplace/community-operators-mpjsp" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="registry-server" containerID="cri-o://1daa1aaf3b5fe03ebea9132c909cc38da98e4a17208c0b5b1ba83ee0358929b0" gracePeriod=30 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.445710 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.446075 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" containerID="cri-o://6da2b3db01910ff5a949506b9f1fcd89db5d5dcbadc821a053bd820a24a7c37b" gracePeriod=30 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.460519 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.461178 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-z8j2r" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="registry-server" containerID="cri-o://c7f214d447c87c57cf0d136d6a477d47b7637f0dfa344988ac59335bb40597b5" gracePeriod=30 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.466902 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.467348 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p5nh9" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="registry-server" containerID="cri-o://202b742be89e34126fdc698910c0c455020ed04a9bf75db6b4a611df61c176d8" gracePeriod=30 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.476199 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66h4r"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.477366 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.483063 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66h4r"] Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.604911 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.604984 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.605062 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jprdg\" (UniqueName: \"kubernetes.io/projected/36a69577-98bd-420f-b49a-f004c20de1e0-kube-api-access-jprdg\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.708320 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jprdg\" (UniqueName: \"kubernetes.io/projected/36a69577-98bd-420f-b49a-f004c20de1e0-kube-api-access-jprdg\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.708405 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.708447 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.716982 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.724930 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: 
\"kubernetes.io/secret/36a69577-98bd-420f-b49a-f004c20de1e0-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.743213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jprdg\" (UniqueName: \"kubernetes.io/projected/36a69577-98bd-420f-b49a-f004c20de1e0-kube-api-access-jprdg\") pod \"marketplace-operator-79b997595-66h4r\" (UID: \"36a69577-98bd-420f-b49a-f004c20de1e0\") " pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.937827 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.957340 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.996557 4958 generic.go:334] "Generic (PLEG): container finished" podID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerID="ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830" exitCode=0 Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.996658 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerDied","Data":"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830"} Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.996695 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-549hv" event={"ID":"fcb5229f-2b8f-4e6a-8542-cd03b84e9737","Type":"ContainerDied","Data":"cad3dbe1843341eaf6a0fdc589d7828ecba0489d05ce6107941e638cf6856f4b"} Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.996716 4958 scope.go:117] "RemoveContainer" containerID="ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830" Mar 20 09:06:42 crc kubenswrapper[4958]: I0320 09:06:42.996865 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-549hv" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.006415 4958 generic.go:334] "Generic (PLEG): container finished" podID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerID="c7f214d447c87c57cf0d136d6a477d47b7637f0dfa344988ac59335bb40597b5" exitCode=0 Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.006558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerDied","Data":"c7f214d447c87c57cf0d136d6a477d47b7637f0dfa344988ac59335bb40597b5"} Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.008942 4958 generic.go:334] "Generic (PLEG): container finished" podID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerID="6da2b3db01910ff5a949506b9f1fcd89db5d5dcbadc821a053bd820a24a7c37b" exitCode=0 Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.009017 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" event={"ID":"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4","Type":"ContainerDied","Data":"6da2b3db01910ff5a949506b9f1fcd89db5d5dcbadc821a053bd820a24a7c37b"} Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.018045 4958 generic.go:334] "Generic (PLEG): container finished" podID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerID="1daa1aaf3b5fe03ebea9132c909cc38da98e4a17208c0b5b1ba83ee0358929b0" exitCode=0 Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.018144 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerDied","Data":"1daa1aaf3b5fe03ebea9132c909cc38da98e4a17208c0b5b1ba83ee0358929b0"} Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.023772 4958 generic.go:334] "Generic (PLEG): container finished" podID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerID="202b742be89e34126fdc698910c0c455020ed04a9bf75db6b4a611df61c176d8" exitCode=0 Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.023835 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerDied","Data":"202b742be89e34126fdc698910c0c455020ed04a9bf75db6b4a611df61c176d8"} Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.057367 4958 scope.go:117] "RemoveContainer" containerID="1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.116993 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities\") pod \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.117102 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content\") pod \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.117156 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bzbd9\" (UniqueName: \"kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9\") pod 
\"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\" (UID: \"fcb5229f-2b8f-4e6a-8542-cd03b84e9737\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.118238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities" (OuterVolumeSpecName: "utilities") pod "fcb5229f-2b8f-4e6a-8542-cd03b84e9737" (UID: "fcb5229f-2b8f-4e6a-8542-cd03b84e9737"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.124422 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9" (OuterVolumeSpecName: "kube-api-access-bzbd9") pod "fcb5229f-2b8f-4e6a-8542-cd03b84e9737" (UID: "fcb5229f-2b8f-4e6a-8542-cd03b84e9737"). InnerVolumeSpecName "kube-api-access-bzbd9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.148236 4958 scope.go:117] "RemoveContainer" containerID="8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.199209 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.215184 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.215522 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.218988 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bzbd9\" (UniqueName: \"kubernetes.io/projected/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-kube-api-access-bzbd9\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.219010 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.233222 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fcb5229f-2b8f-4e6a-8542-cd03b84e9737" (UID: "fcb5229f-2b8f-4e6a-8542-cd03b84e9737"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.237458 4958 scope.go:117] "RemoveContainer" containerID="ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830" Mar 20 09:06:43 crc kubenswrapper[4958]: E0320 09:06:43.238442 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830\": container with ID starting with ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830 not found: ID does not exist" containerID="ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.238479 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830"} err="failed to get container status \"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830\": rpc error: code = NotFound desc = could not find container \"ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830\": container with ID starting with ff58cc81ca8d3e92db8cc0f5ce691aa64952eac11f672a6cdf9af667a1fa1830 not found: ID does not exist" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.238508 4958 scope.go:117] "RemoveContainer" containerID="1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a" Mar 20 09:06:43 crc kubenswrapper[4958]: E0320 09:06:43.240516 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a\": container with ID starting with 1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a not found: ID does not exist" containerID="1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.240567 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a"} err="failed to get container status \"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a\": rpc error: code = NotFound desc = could not find container \"1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a\": container with ID starting with 1660f3286d8d61acd677c1f41070256c06811efe02aebe9a64674b19e9a01c4a not found: ID does not exist" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.240626 4958 scope.go:117] "RemoveContainer" containerID="8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f" Mar 20 09:06:43 crc kubenswrapper[4958]: E0320 09:06:43.243049 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f\": container with ID starting with 8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f not found: ID does not exist" containerID="8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.243088 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f"} err="failed to get container status \"8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f\": rpc error: code = NotFound desc = could not 
find container \"8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f\": container with ID starting with 8f14423037a360e71c00d8817da58b978ba5058037bfc0987d5cf2d07e867b8f not found: ID does not exist" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.285399 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.319865 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities\") pod \"c97ca1fb-e042-4273-b024-bc9dbc806359\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.320545 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities\") pod \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.321053 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content\") pod \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.321199 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content\") pod \"c97ca1fb-e042-4273-b024-bc9dbc806359\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.322255 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l74ms\" (UniqueName: \"kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms\") pod \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\" (UID: \"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.322414 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content\") pod \"1301d3a7-31fd-44f4-825d-a579e4026c7a\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.322537 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities\") pod \"1301d3a7-31fd-44f4-825d-a579e4026c7a\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.322655 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8kh6\" (UniqueName: \"kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6\") pod \"c97ca1fb-e042-4273-b024-bc9dbc806359\" (UID: \"c97ca1fb-e042-4273-b024-bc9dbc806359\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.322773 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlt8l\" (UniqueName: \"kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l\") pod 
\"1301d3a7-31fd-44f4-825d-a579e4026c7a\" (UID: \"1301d3a7-31fd-44f4-825d-a579e4026c7a\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.323391 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fcb5229f-2b8f-4e6a-8542-cd03b84e9737-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.323505 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities" (OuterVolumeSpecName: "utilities") pod "1301d3a7-31fd-44f4-825d-a579e4026c7a" (UID: "1301d3a7-31fd-44f4-825d-a579e4026c7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.325819 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities" (OuterVolumeSpecName: "utilities") pod "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" (UID: "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.327448 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities" (OuterVolumeSpecName: "utilities") pod "c97ca1fb-e042-4273-b024-bc9dbc806359" (UID: "c97ca1fb-e042-4273-b024-bc9dbc806359"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.330159 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6" (OuterVolumeSpecName: "kube-api-access-z8kh6") pod "c97ca1fb-e042-4273-b024-bc9dbc806359" (UID: "c97ca1fb-e042-4273-b024-bc9dbc806359"). InnerVolumeSpecName "kube-api-access-z8kh6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.334719 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms" (OuterVolumeSpecName: "kube-api-access-l74ms") pod "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" (UID: "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e"). InnerVolumeSpecName "kube-api-access-l74ms". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.334862 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l" (OuterVolumeSpecName: "kube-api-access-jlt8l") pod "1301d3a7-31fd-44f4-825d-a579e4026c7a" (UID: "1301d3a7-31fd-44f4-825d-a579e4026c7a"). InnerVolumeSpecName "kube-api-access-jlt8l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.346743 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-549hv"] Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.355669 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-549hv"] Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.363456 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c97ca1fb-e042-4273-b024-bc9dbc806359" (UID: "c97ca1fb-e042-4273-b024-bc9dbc806359"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.406874 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1301d3a7-31fd-44f4-825d-a579e4026c7a" (UID: "1301d3a7-31fd-44f4-825d-a579e4026c7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.424215 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics\") pod \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.424293 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mgzn5\" (UniqueName: \"kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5\") pod \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.424464 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca\") pod \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\" (UID: \"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4\") " Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.425515 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" (UID: "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.427631 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8kh6\" (UniqueName: \"kubernetes.io/projected/c97ca1fb-e042-4273-b024-bc9dbc806359-kube-api-access-z8kh6\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.427851 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.427948 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlt8l\" (UniqueName: \"kubernetes.io/projected/1301d3a7-31fd-44f4-825d-a579e4026c7a-kube-api-access-jlt8l\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.428312 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.428413 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.428649 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c97ca1fb-e042-4273-b024-bc9dbc806359-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.428841 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l74ms\" (UniqueName: \"kubernetes.io/projected/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-kube-api-access-l74ms\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.428927 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.429001 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1301d3a7-31fd-44f4-825d-a579e4026c7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.429295 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5" (OuterVolumeSpecName: "kube-api-access-mgzn5") pod "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" (UID: "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4"). InnerVolumeSpecName "kube-api-access-mgzn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.430662 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" (UID: "ea4a1ebf-01bd-4907-a6fb-2e31e463acb4"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.475825 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" (UID: "ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.478492 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-66h4r"] Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.532012 4958 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.532047 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mgzn5\" (UniqueName: \"kubernetes.io/projected/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4-kube-api-access-mgzn5\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:43 crc kubenswrapper[4958]: I0320 09:06:43.532057 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.031986 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-mpjsp" event={"ID":"1301d3a7-31fd-44f4-825d-a579e4026c7a","Type":"ContainerDied","Data":"844dac9951fadca61dc09bb6fa55c3e3620fcc77cab2e921303f4e8b8f330cc1"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.032043 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-mpjsp" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.032071 4958 scope.go:117] "RemoveContainer" containerID="1daa1aaf3b5fe03ebea9132c909cc38da98e4a17208c0b5b1ba83ee0358929b0" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.034390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" event={"ID":"ea4a1ebf-01bd-4907-a6fb-2e31e463acb4","Type":"ContainerDied","Data":"9c72d60bc35d6f628d2db1fe380068e5dab129b6be7ed743f2d2f5bb6130d977"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.034415 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-2gwpt" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.036345 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" event={"ID":"36a69577-98bd-420f-b49a-f004c20de1e0","Type":"ContainerStarted","Data":"0ec631c945527cc24b0d32ed3cc77d19c9bdd7cc57c657974b27cbbfc191d9cc"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.036398 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" event={"ID":"36a69577-98bd-420f-b49a-f004c20de1e0","Type":"ContainerStarted","Data":"c2e507fa2d714bcc2c1d5d95235d0a820f285a8d9f87a4b1a3309334de0eebb7"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.036415 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.039568 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p5nh9" event={"ID":"ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e","Type":"ContainerDied","Data":"307af55e839e94ca4aa26086003cc12be08cc61758452900b3809dba41aee089"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.039681 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p5nh9" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.042491 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.046445 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z8j2r" event={"ID":"c97ca1fb-e042-4273-b024-bc9dbc806359","Type":"ContainerDied","Data":"628e318d42108a9b4a134e2ac237c451e8568e1574268dc323742c4d0135ffad"} Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.046576 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z8j2r" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.058212 4958 scope.go:117] "RemoveContainer" containerID="d112343654e8ece2c555f721784929b792585a044f3751aed69efac0755581df" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.071023 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-66h4r" podStartSLOduration=2.070993543 podStartE2EDuration="2.070993543s" podCreationTimestamp="2026-03-20 09:06:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:06:44.061995411 +0000 UTC m=+424.384011369" watchObservedRunningTime="2026-03-20 09:06:44.070993543 +0000 UTC m=+424.393009491" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.089458 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-mpjsp"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.092436 4958 scope.go:117] "RemoveContainer" containerID="a7b461d3196a9ec1b2875f1bff180e2af981110343219c7dc7214bcbe903a613" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.095528 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-mpjsp"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.124663 4958 scope.go:117] "RemoveContainer" containerID="6da2b3db01910ff5a949506b9f1fcd89db5d5dcbadc821a053bd820a24a7c37b" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.146041 4958 scope.go:117] "RemoveContainer" containerID="202b742be89e34126fdc698910c0c455020ed04a9bf75db6b4a611df61c176d8" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.159707 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.167916 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p5nh9"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.176310 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.176963 4958 scope.go:117] "RemoveContainer" containerID="b0b56e981b3dca165ff19e6b74900926c1d0c14b8697e35b982049aa89a67714" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.184496 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-2gwpt"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.195132 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.207223 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-z8j2r"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.217375 4958 scope.go:117] "RemoveContainer" containerID="c32e251289438dca04f9f1f8bc8e949811c0f70f58ee1bc6242a9c5c9922fa4e" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.238875 4958 scope.go:117] "RemoveContainer" containerID="c7f214d447c87c57cf0d136d6a477d47b7637f0dfa344988ac59335bb40597b5" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.256899 4958 scope.go:117] "RemoveContainer" containerID="dd0b3d0163aacce5568211b8e740b7799f69206fdf5b2d578b6025241d9500e1" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 
09:06:44.278195 4958 scope.go:117] "RemoveContainer" containerID="58e7e8c24e35be1d9a5b6c9decfcd600d5441afe9fa4da4377219fb39637ab71" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.442581 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" path="/var/lib/kubelet/pods/1301d3a7-31fd-44f4-825d-a579e4026c7a/volumes" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.443308 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" path="/var/lib/kubelet/pods/ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e/volumes" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.443965 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" path="/var/lib/kubelet/pods/c97ca1fb-e042-4273-b024-bc9dbc806359/volumes" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.447759 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" path="/var/lib/kubelet/pods/ea4a1ebf-01bd-4907-a6fb-2e31e463acb4/volumes" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.448377 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" path="/var/lib/kubelet/pods/fcb5229f-2b8f-4e6a-8542-cd03b84e9737/volumes" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.626753 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-z2tl4"] Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627108 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627124 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627140 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627147 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627158 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627167 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627179 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627188 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627197 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627203 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="extract-content" Mar 
20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627212 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627219 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="extract-content" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627227 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627233 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627243 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627249 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627263 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627270 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627277 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627286 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627297 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627305 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627319 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627326 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="extract-utilities" Mar 20 09:06:44 crc kubenswrapper[4958]: E0320 09:06:44.627335 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627341 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627459 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea4a1ebf-01bd-4907-a6fb-2e31e463acb4" containerName="marketplace-operator" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627468 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="ab9a0cb6-f5b5-43a3-847e-2c4da47cff4e" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627475 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fcb5229f-2b8f-4e6a-8542-cd03b84e9737" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627484 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c97ca1fb-e042-4273-b024-bc9dbc806359" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.627492 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1301d3a7-31fd-44f4-825d-a579e4026c7a" containerName="registry-server" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.628343 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.631300 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.640300 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2tl4"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.768699 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-catalog-content\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.768824 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zqbgb\" (UniqueName: \"kubernetes.io/projected/98737b72-788c-4867-b476-d0723c9111d1-kube-api-access-zqbgb\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.768893 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-utilities\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.810927 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-779ld"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.813549 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.818486 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.830815 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-779ld"] Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.870874 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-catalog-content\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.871013 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zqbgb\" (UniqueName: \"kubernetes.io/projected/98737b72-788c-4867-b476-d0723c9111d1-kube-api-access-zqbgb\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.871047 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-utilities\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.871928 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-catalog-content\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.872320 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/98737b72-788c-4867-b476-d0723c9111d1-utilities\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.894589 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zqbgb\" (UniqueName: \"kubernetes.io/projected/98737b72-788c-4867-b476-d0723c9111d1-kube-api-access-zqbgb\") pod \"redhat-marketplace-z2tl4\" (UID: \"98737b72-788c-4867-b476-d0723c9111d1\") " pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.957301 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.972386 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-utilities\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.972458 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrd7n\" (UniqueName: \"kubernetes.io/projected/e817fe38-a7fc-4fc7-8eec-739e3c76b459-kube-api-access-mrd7n\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:44 crc kubenswrapper[4958]: I0320 09:06:44.972558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-catalog-content\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.075552 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-utilities\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.076112 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrd7n\" (UniqueName: \"kubernetes.io/projected/e817fe38-a7fc-4fc7-8eec-739e3c76b459-kube-api-access-mrd7n\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.076293 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-catalog-content\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.077150 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-utilities\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.081991 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e817fe38-a7fc-4fc7-8eec-739e3c76b459-catalog-content\") pod \"redhat-operators-779ld\" (UID: \"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.104729 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrd7n\" (UniqueName: \"kubernetes.io/projected/e817fe38-a7fc-4fc7-8eec-739e3c76b459-kube-api-access-mrd7n\") pod \"redhat-operators-779ld\" (UID: 
\"e817fe38-a7fc-4fc7-8eec-739e3c76b459\") " pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.133930 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.448262 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-z2tl4"] Mar 20 09:06:45 crc kubenswrapper[4958]: I0320 09:06:45.583275 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-779ld"] Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.102954 4958 generic.go:334] "Generic (PLEG): container finished" podID="e817fe38-a7fc-4fc7-8eec-739e3c76b459" containerID="76c5050af3806a916992cb0bd8dff308313a065d22866373daad03f016ae4f31" exitCode=0 Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.103024 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-779ld" event={"ID":"e817fe38-a7fc-4fc7-8eec-739e3c76b459","Type":"ContainerDied","Data":"76c5050af3806a916992cb0bd8dff308313a065d22866373daad03f016ae4f31"} Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.103092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-779ld" event={"ID":"e817fe38-a7fc-4fc7-8eec-739e3c76b459","Type":"ContainerStarted","Data":"c9a00c9e3782b266c21c7656d8bcafe5d36abe09530f1a456a616bdb93cdfe7d"} Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.107028 4958 generic.go:334] "Generic (PLEG): container finished" podID="98737b72-788c-4867-b476-d0723c9111d1" containerID="5c1666aa60c60349e19853dc8cf683ab99cc492649cd805d2db82b55c5e88577" exitCode=0 Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.107161 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2tl4" event={"ID":"98737b72-788c-4867-b476-d0723c9111d1","Type":"ContainerDied","Data":"5c1666aa60c60349e19853dc8cf683ab99cc492649cd805d2db82b55c5e88577"} Mar 20 09:06:46 crc kubenswrapper[4958]: I0320 09:06:46.107238 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2tl4" event={"ID":"98737b72-788c-4867-b476-d0723c9111d1","Type":"ContainerStarted","Data":"88aff456505191b74013b2d727e53d5d1b21d50f99d4b2ddf5aec3567b850374"} Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.011988 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-hghdm"] Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.013918 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.020072 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.023960 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hghdm"] Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.113286 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-utilities\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.113345 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-catalog-content\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.113517 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4hdpq\" (UniqueName: \"kubernetes.io/projected/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-kube-api-access-4hdpq\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.207866 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.210940 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.213224 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.214821 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4hdpq\" (UniqueName: \"kubernetes.io/projected/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-kube-api-access-4hdpq\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.214919 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-utilities\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.214952 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-catalog-content\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.215565 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-utilities\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.215650 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-catalog-content\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.218733 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.239788 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4hdpq\" (UniqueName: \"kubernetes.io/projected/75f0af6a-35bc-4beb-bd7e-4a7c1c37155d-kube-api-access-4hdpq\") pod \"certified-operators-hghdm\" (UID: \"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d\") " pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.317022 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42csf\" (UniqueName: \"kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.317093 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content\") pod \"community-operators-rbv9h\" (UID: 
\"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.317149 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.419109 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42csf\" (UniqueName: \"kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.419161 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.419232 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.419741 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.419970 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.438515 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.439235 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42csf\" (UniqueName: \"kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf\") pod \"community-operators-rbv9h\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.529162 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.870344 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-hghdm"] Mar 20 09:06:47 crc kubenswrapper[4958]: I0320 09:06:47.970512 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.123024 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerStarted","Data":"3ead8d85be346e65114969c2b1885ef2f67063d8662920fdfa2e3ceb7a16db58"} Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.124820 4958 generic.go:334] "Generic (PLEG): container finished" podID="75f0af6a-35bc-4beb-bd7e-4a7c1c37155d" containerID="8c6699a7e88067e75a27cbf67f3f1a42b01141fe4bb61409c920a9a957a97932" exitCode=0 Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.124897 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghdm" event={"ID":"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d","Type":"ContainerDied","Data":"8c6699a7e88067e75a27cbf67f3f1a42b01141fe4bb61409c920a9a957a97932"} Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.124932 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghdm" event={"ID":"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d","Type":"ContainerStarted","Data":"24f052371080802d6812e0d283c23d43618ed22ec99b6685ee7e8f48e34b12e2"} Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.126868 4958 generic.go:334] "Generic (PLEG): container finished" podID="98737b72-788c-4867-b476-d0723c9111d1" containerID="fedacdab8f6dd4fa6dbfc6566f0ce6adc9bb081819dba3876bf4496b275aadeb" exitCode=0 Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.126909 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2tl4" event={"ID":"98737b72-788c-4867-b476-d0723c9111d1","Type":"ContainerDied","Data":"fedacdab8f6dd4fa6dbfc6566f0ce6adc9bb081819dba3876bf4496b275aadeb"} Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.128756 4958 generic.go:334] "Generic (PLEG): container finished" podID="e817fe38-a7fc-4fc7-8eec-739e3c76b459" containerID="6f6a5525f66a9e836c393192505d1a89bdad21a89123a4c8873ea3e2febd3418" exitCode=0 Mar 20 09:06:48 crc kubenswrapper[4958]: I0320 09:06:48.128873 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-779ld" event={"ID":"e817fe38-a7fc-4fc7-8eec-739e3c76b459","Type":"ContainerDied","Data":"6f6a5525f66a9e836c393192505d1a89bdad21a89123a4c8873ea3e2febd3418"} Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.138542 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-z2tl4" event={"ID":"98737b72-788c-4867-b476-d0723c9111d1","Type":"ContainerStarted","Data":"43074894aa2d7af8f9bba8db470c685f6ab31294bacd6b7f2b4ef02bea493d53"} Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.141311 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-779ld" event={"ID":"e817fe38-a7fc-4fc7-8eec-739e3c76b459","Type":"ContainerStarted","Data":"3d6f805d16632561d49bc4504a87ee4dc1e3e38a96a47c269631d281bd405df4"} Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.144540 4958 generic.go:334] "Generic (PLEG): container 
finished" podID="0930a6b5-25c2-441d-8204-b483adf7da51" containerID="f2e9f6075254cc62fe265776201f32342fca72830925d1cabe65d30e0cd6fcb8" exitCode=0 Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.144673 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerDied","Data":"f2e9f6075254cc62fe265776201f32342fca72830925d1cabe65d30e0cd6fcb8"} Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.176985 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-z2tl4" podStartSLOduration=2.536344738 podStartE2EDuration="5.176962925s" podCreationTimestamp="2026-03-20 09:06:44 +0000 UTC" firstStartedPulling="2026-03-20 09:06:46.109224119 +0000 UTC m=+426.431240077" lastFinishedPulling="2026-03-20 09:06:48.749842306 +0000 UTC m=+429.071858264" observedRunningTime="2026-03-20 09:06:49.173118712 +0000 UTC m=+429.495134670" watchObservedRunningTime="2026-03-20 09:06:49.176962925 +0000 UTC m=+429.498978883" Mar 20 09:06:49 crc kubenswrapper[4958]: I0320 09:06:49.218941 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-779ld" podStartSLOduration=2.698038834 podStartE2EDuration="5.218915921s" podCreationTimestamp="2026-03-20 09:06:44 +0000 UTC" firstStartedPulling="2026-03-20 09:06:46.104926613 +0000 UTC m=+426.426942571" lastFinishedPulling="2026-03-20 09:06:48.6258037 +0000 UTC m=+428.947819658" observedRunningTime="2026-03-20 09:06:49.213221035 +0000 UTC m=+429.535236993" watchObservedRunningTime="2026-03-20 09:06:49.218915921 +0000 UTC m=+429.540931879" Mar 20 09:06:50 crc kubenswrapper[4958]: I0320 09:06:50.153568 4958 generic.go:334] "Generic (PLEG): container finished" podID="75f0af6a-35bc-4beb-bd7e-4a7c1c37155d" containerID="ce6db2b45b90c1a87dd9962a07898f3b7c103ff99d177535aadf942657676bba" exitCode=0 Mar 20 09:06:50 crc kubenswrapper[4958]: I0320 09:06:50.153685 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghdm" event={"ID":"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d","Type":"ContainerDied","Data":"ce6db2b45b90c1a87dd9962a07898f3b7c103ff99d177535aadf942657676bba"} Mar 20 09:06:50 crc kubenswrapper[4958]: I0320 09:06:50.158331 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerStarted","Data":"c5a79b86bbee78c6d6b239b1f7e67a6452715e2acc9abc8ac79262809cc522a0"} Mar 20 09:06:51 crc kubenswrapper[4958]: I0320 09:06:51.166731 4958 generic.go:334] "Generic (PLEG): container finished" podID="0930a6b5-25c2-441d-8204-b483adf7da51" containerID="c5a79b86bbee78c6d6b239b1f7e67a6452715e2acc9abc8ac79262809cc522a0" exitCode=0 Mar 20 09:06:51 crc kubenswrapper[4958]: I0320 09:06:51.166859 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerDied","Data":"c5a79b86bbee78c6d6b239b1f7e67a6452715e2acc9abc8ac79262809cc522a0"} Mar 20 09:06:51 crc kubenswrapper[4958]: I0320 09:06:51.170334 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-hghdm" event={"ID":"75f0af6a-35bc-4beb-bd7e-4a7c1c37155d","Type":"ContainerStarted","Data":"a2057e15ffa61de3255f51becb344592e22ecfa9b32d996bec3e691951d4611e"} Mar 20 09:06:51 crc 
kubenswrapper[4958]: I0320 09:06:51.214467 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-hghdm" podStartSLOduration=2.386557484 podStartE2EDuration="5.214409517s" podCreationTimestamp="2026-03-20 09:06:46 +0000 UTC" firstStartedPulling="2026-03-20 09:06:48.127863611 +0000 UTC m=+428.449879569" lastFinishedPulling="2026-03-20 09:06:50.955715644 +0000 UTC m=+431.277731602" observedRunningTime="2026-03-20 09:06:51.211227284 +0000 UTC m=+431.533243242" watchObservedRunningTime="2026-03-20 09:06:51.214409517 +0000 UTC m=+431.536425495" Mar 20 09:06:52 crc kubenswrapper[4958]: I0320 09:06:52.180817 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerStarted","Data":"0b2c700278493776cb0b09fd3e4fb34a7c6921b51536a6ac28817cc0a89dfc84"} Mar 20 09:06:52 crc kubenswrapper[4958]: I0320 09:06:52.205822 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-rbv9h" podStartSLOduration=2.696499515 podStartE2EDuration="5.205795533s" podCreationTimestamp="2026-03-20 09:06:47 +0000 UTC" firstStartedPulling="2026-03-20 09:06:49.149725389 +0000 UTC m=+429.471741347" lastFinishedPulling="2026-03-20 09:06:51.659021407 +0000 UTC m=+431.981037365" observedRunningTime="2026-03-20 09:06:52.20432432 +0000 UTC m=+432.526340288" watchObservedRunningTime="2026-03-20 09:06:52.205795533 +0000 UTC m=+432.527811491" Mar 20 09:06:54 crc kubenswrapper[4958]: I0320 09:06:54.958484 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:54 crc kubenswrapper[4958]: I0320 09:06:54.959548 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:55 crc kubenswrapper[4958]: I0320 09:06:55.007583 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:55 crc kubenswrapper[4958]: I0320 09:06:55.134503 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:55 crc kubenswrapper[4958]: I0320 09:06:55.135046 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-779ld" Mar 20 09:06:55 crc kubenswrapper[4958]: I0320 09:06:55.237802 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-z2tl4" Mar 20 09:06:56 crc kubenswrapper[4958]: I0320 09:06:56.180521 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-779ld" podUID="e817fe38-a7fc-4fc7-8eec-739e3c76b459" containerName="registry-server" probeResult="failure" output=< Mar 20 09:06:56 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Mar 20 09:06:56 crc kubenswrapper[4958]: > Mar 20 09:06:56 crc kubenswrapper[4958]: I0320 09:06:56.524114 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:06:56 crc kubenswrapper[4958]: I0320 09:06:56.524238 4958 prober.go:107] "Probe 
failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.439248 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.439304 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.485571 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.530005 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.530085 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:57 crc kubenswrapper[4958]: I0320 09:06:57.569548 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:58 crc kubenswrapper[4958]: I0320 09:06:58.259171 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:06:58 crc kubenswrapper[4958]: I0320 09:06:58.260889 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-hghdm" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.137136 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" podUID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" containerName="registry" containerID="cri-o://f24ac4694c5b9dbd1a9eb6564ebeee67309d3f4d13e26f98ac55148eea17fa12" gracePeriod=30 Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.261532 4958 generic.go:334] "Generic (PLEG): container finished" podID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" containerID="f24ac4694c5b9dbd1a9eb6564ebeee67309d3f4d13e26f98ac55148eea17fa12" exitCode=0 Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.261588 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" event={"ID":"7fc6b17f-3483-409e-aee4-011ce5afd4c2","Type":"ContainerDied","Data":"f24ac4694c5b9dbd1a9eb6564ebeee67309d3f4d13e26f98ac55148eea17fa12"} Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.624773 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789022 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789664 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-shjjt\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789709 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789766 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789805 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.789856 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.790117 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.790178 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates\") pod \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\" (UID: \"7fc6b17f-3483-409e-aee4-011ce5afd4c2\") " Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.791275 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.793405 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.798325 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.798377 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt" (OuterVolumeSpecName: "kube-api-access-shjjt") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "kube-api-access-shjjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.800205 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.800434 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:07:02 crc kubenswrapper[4958]: I0320 09:07:02.806810 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.821488 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "7fc6b17f-3483-409e-aee4-011ce5afd4c2" (UID: "7fc6b17f-3483-409e-aee4-011ce5afd4c2"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891703 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-shjjt\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-kube-api-access-shjjt\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891748 4958 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891759 4958 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/7fc6b17f-3483-409e-aee4-011ce5afd4c2-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891770 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891781 4958 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891789 4958 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/7fc6b17f-3483-409e-aee4-011ce5afd4c2-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:02.891798 4958 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/7fc6b17f-3483-409e-aee4-011ce5afd4c2-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:03.269861 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-flhr9" event={"ID":"7fc6b17f-3483-409e-aee4-011ce5afd4c2","Type":"ContainerDied","Data":"ab63a23295153380160611432f61a3d1bd726635050a765f6700f8ca28a4194d"} Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:03.269927 4958 scope.go:117] "RemoveContainer" containerID="f24ac4694c5b9dbd1a9eb6564ebeee67309d3f4d13e26f98ac55148eea17fa12" Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:03.270052 4958 util.go:48] "No ready sandbox for pod can be found. 
Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:03.308925 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"]
Mar 20 09:07:03 crc kubenswrapper[4958]: I0320 09:07:03.312908 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-flhr9"]
Mar 20 09:07:04 crc kubenswrapper[4958]: I0320 09:07:04.450076 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" path="/var/lib/kubelet/pods/7fc6b17f-3483-409e-aee4-011ce5afd4c2/volumes"
Mar 20 09:07:05 crc kubenswrapper[4958]: I0320 09:07:05.182703 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-779ld"
Mar 20 09:07:05 crc kubenswrapper[4958]: I0320 09:07:05.232582 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-779ld"
Mar 20 09:07:26 crc kubenswrapper[4958]: I0320 09:07:26.521578 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:07:26 crc kubenswrapper[4958]: I0320 09:07:26.522291 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:07:26 crc kubenswrapper[4958]: I0320 09:07:26.522353 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:07:26 crc kubenswrapper[4958]: I0320 09:07:26.522986 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:07:26 crc kubenswrapper[4958]: I0320 09:07:26.523048 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f" gracePeriod=600
Mar 20 09:07:27 crc kubenswrapper[4958]: I0320 09:07:27.443583 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f" exitCode=0
Mar 20 09:07:27 crc kubenswrapper[4958]: I0320 09:07:27.443660 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f"}
Mar 20 09:07:27 crc kubenswrapper[4958]: I0320 09:07:27.444187 4958 kubelet.go:2453] "SyncLoop (PLEG):
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38"} Mar 20 09:07:27 crc kubenswrapper[4958]: I0320 09:07:27.444222 4958 scope.go:117] "RemoveContainer" containerID="88f3e5ed1f7de48086e130f2e2668dc14d86e2fd75e1a4d3599509b540b06711" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.182650 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566628-mrzx6"] Mar 20 09:08:00 crc kubenswrapper[4958]: E0320 09:08:00.183307 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" containerName="registry" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.183321 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" containerName="registry" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.183416 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7fc6b17f-3483-409e-aee4-011ce5afd4c2" containerName="registry" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.183848 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-mrzx6" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.187136 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.187732 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.190669 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.208877 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-mrzx6"] Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.271008 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vdkqh\" (UniqueName: \"kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh\") pod \"auto-csr-approver-29566628-mrzx6\" (UID: \"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e\") " pod="openshift-infra/auto-csr-approver-29566628-mrzx6" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.372535 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vdkqh\" (UniqueName: \"kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh\") pod \"auto-csr-approver-29566628-mrzx6\" (UID: \"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e\") " pod="openshift-infra/auto-csr-approver-29566628-mrzx6" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.407711 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vdkqh\" (UniqueName: \"kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh\") pod \"auto-csr-approver-29566628-mrzx6\" (UID: \"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e\") " pod="openshift-infra/auto-csr-approver-29566628-mrzx6" Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.499845 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-mrzx6"
Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.759976 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-mrzx6"]
Mar 20 09:08:00 crc kubenswrapper[4958]: I0320 09:08:00.783730 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 09:08:01 crc kubenswrapper[4958]: I0320 09:08:01.687174 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-mrzx6" event={"ID":"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e","Type":"ContainerStarted","Data":"2af442d4621b99e0f421d625c2a640c711b2149318a30542f2a18edda9c7f2fb"}
Mar 20 09:08:03 crc kubenswrapper[4958]: I0320 09:08:03.704725 4958 generic.go:334] "Generic (PLEG): container finished" podID="b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" containerID="8c4d4f89fc944bca692270c70c54a731a779528750f7c103e3d829a11a136518" exitCode=0
Mar 20 09:08:03 crc kubenswrapper[4958]: I0320 09:08:03.705217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-mrzx6" event={"ID":"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e","Type":"ContainerDied","Data":"8c4d4f89fc944bca692270c70c54a731a779528750f7c103e3d829a11a136518"}
Mar 20 09:08:04 crc kubenswrapper[4958]: I0320 09:08:04.969874 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-mrzx6"
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.040303 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vdkqh\" (UniqueName: \"kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh\") pod \"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e\" (UID: \"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e\") "
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.047915 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh" (OuterVolumeSpecName: "kube-api-access-vdkqh") pod "b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" (UID: "b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e"). InnerVolumeSpecName "kube-api-access-vdkqh". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.142224 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vdkqh\" (UniqueName: \"kubernetes.io/projected/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e-kube-api-access-vdkqh\") on node \"crc\" DevicePath \"\""
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.724238 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566628-mrzx6" event={"ID":"b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e","Type":"ContainerDied","Data":"2af442d4621b99e0f421d625c2a640c711b2149318a30542f2a18edda9c7f2fb"}
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.724303 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2af442d4621b99e0f421d625c2a640c711b2149318a30542f2a18edda9c7f2fb"
Mar 20 09:08:05 crc kubenswrapper[4958]: I0320 09:08:05.724385 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566628-mrzx6"
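
The auto-csr-approver pods follow the CronJob naming pattern: the numeric suffix is the scheduled time in minutes since the Unix epoch (29566628 min = 2026-03-20 09:08 UTC, matching the ADD time), and the suffixes step by two across the runs at 09:08, 09:10, and 09:12, i.e. a */2 schedule. Each run's single container is named "oc" and exits 0 within seconds. The log never shows the command it runs; a hypothetical equivalent that shells out to the oc CLI to approve pending CSRs might look like:

package main

import (
	"log"
	"os/exec"
	"strings"
)

// Hypothetical reconstruction of what a container named "oc" in an
// auto-csr-approver job could run; the actual command is not in the log.
func main() {
	out, err := exec.Command("oc", "get", "csr", "-o", "name").Output()
	if err != nil {
		log.Fatalf("listing CSRs: %v", err)
	}
	for _, line := range strings.Fields(string(out)) {
		name := strings.TrimPrefix(line, "certificatesigningrequest.certificates.k8s.io/")
		// "oc adm certificate approve" sets the Approved condition;
		// re-running it on an already-approved CSR is harmless.
		if err := exec.Command("oc", "adm", "certificate", "approve", name).Run(); err != nil {
			log.Printf("approve %s: %v", name, err)
		}
	}
}
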
Mar 20 09:08:06 crc kubenswrapper[4958]: I0320 09:08:06.024435 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-xd9xt"]
Mar 20 09:08:06 crc kubenswrapper[4958]: I0320 09:08:06.028964 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566622-xd9xt"]
Mar 20 09:08:06 crc kubenswrapper[4958]: I0320 09:08:06.445024 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="375c7798-d728-48b0-ac0d-27ba8f57a393" path="/var/lib/kubelet/pods/375c7798-d728-48b0-ac0d-27ba8f57a393/volumes"
Mar 20 09:09:26 crc kubenswrapper[4958]: I0320 09:09:26.521795 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:09:26 crc kubenswrapper[4958]: I0320 09:09:26.522584 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:09:42 crc kubenswrapper[4958]: I0320 09:09:42.772963 4958 scope.go:117] "RemoveContainer" containerID="518ef97b7142a906c9a60e6043be113540c5683a89c2b005ab6356d5fae86135"
Mar 20 09:09:56 crc kubenswrapper[4958]: I0320 09:09:56.521311 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:09:56 crc kubenswrapper[4958]: I0320 09:09:56.521989 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.145816 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566630-bz4vf"]
Mar 20 09:10:00 crc kubenswrapper[4958]: E0320 09:10:00.146468 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" containerName="oc"
Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.146488 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" containerName="oc"
Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.146717 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" containerName="oc"
Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.147306 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.155375 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.156445 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.156683 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.158564 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-bz4vf"] Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.259038 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dcbnv\" (UniqueName: \"kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv\") pod \"auto-csr-approver-29566630-bz4vf\" (UID: \"5413de9b-2a29-40e8-ace1-8bcd650af14a\") " pod="openshift-infra/auto-csr-approver-29566630-bz4vf" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.360354 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dcbnv\" (UniqueName: \"kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv\") pod \"auto-csr-approver-29566630-bz4vf\" (UID: \"5413de9b-2a29-40e8-ace1-8bcd650af14a\") " pod="openshift-infra/auto-csr-approver-29566630-bz4vf" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.382478 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dcbnv\" (UniqueName: \"kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv\") pod \"auto-csr-approver-29566630-bz4vf\" (UID: \"5413de9b-2a29-40e8-ace1-8bcd650af14a\") " pod="openshift-infra/auto-csr-approver-29566630-bz4vf" Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.525688 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-bz4vf"
Mar 20 09:10:00 crc kubenswrapper[4958]: I0320 09:10:00.766224 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-bz4vf"]
Mar 20 09:10:01 crc kubenswrapper[4958]: I0320 09:10:01.556652 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" event={"ID":"5413de9b-2a29-40e8-ace1-8bcd650af14a","Type":"ContainerStarted","Data":"b9a35cae3cd36efca6efee1813a8323e9e2f1eb5e9fbd0639ec620f67abc9777"}
Mar 20 09:10:02 crc kubenswrapper[4958]: I0320 09:10:02.568511 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" event={"ID":"5413de9b-2a29-40e8-ace1-8bcd650af14a","Type":"ContainerStarted","Data":"eaa790f1e58f13748a111e56b30e665d6c527510bd44d967abb6893d5871028e"}
Mar 20 09:10:02 crc kubenswrapper[4958]: I0320 09:10:02.589846 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" podStartSLOduration=1.20851554 podStartE2EDuration="2.589806745s" podCreationTimestamp="2026-03-20 09:10:00 +0000 UTC" firstStartedPulling="2026-03-20 09:10:00.77634688 +0000 UTC m=+621.098362838" lastFinishedPulling="2026-03-20 09:10:02.157638045 +0000 UTC m=+622.479654043" observedRunningTime="2026-03-20 09:10:02.587516432 +0000 UTC m=+622.909532420" watchObservedRunningTime="2026-03-20 09:10:02.589806745 +0000 UTC m=+622.911822693"
Mar 20 09:10:03 crc kubenswrapper[4958]: I0320 09:10:03.578226 4958 generic.go:334] "Generic (PLEG): container finished" podID="5413de9b-2a29-40e8-ace1-8bcd650af14a" containerID="eaa790f1e58f13748a111e56b30e665d6c527510bd44d967abb6893d5871028e" exitCode=0
Mar 20 09:10:03 crc kubenswrapper[4958]: I0320 09:10:03.578349 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" event={"ID":"5413de9b-2a29-40e8-ace1-8bcd650af14a","Type":"ContainerDied","Data":"eaa790f1e58f13748a111e56b30e665d6c527510bd44d967abb6893d5871028e"}
Mar 20 09:10:04 crc kubenswrapper[4958]: I0320 09:10:04.834768 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-bz4vf"
Mar 20 09:10:04 crc kubenswrapper[4958]: I0320 09:10:04.920786 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dcbnv\" (UniqueName: \"kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv\") pod \"5413de9b-2a29-40e8-ace1-8bcd650af14a\" (UID: \"5413de9b-2a29-40e8-ace1-8bcd650af14a\") "
Mar 20 09:10:04 crc kubenswrapper[4958]: I0320 09:10:04.927769 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv" (OuterVolumeSpecName: "kube-api-access-dcbnv") pod "5413de9b-2a29-40e8-ace1-8bcd650af14a" (UID: "5413de9b-2a29-40e8-ace1-8bcd650af14a"). InnerVolumeSpecName "kube-api-access-dcbnv". PluginName "kubernetes.io/projected", VolumeGidValue ""
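
The pod_startup_latency_tracker record above encodes a simple relation: podStartE2EDuration is watchObservedRunningTime minus podCreationTimestamp, and podStartSLOduration excludes the image-pull window (lastFinishedPulling minus firstStartedPulling), which is how 2.59s end-to-end becomes a 1.21s SLO figure. Checking the arithmetic in Go, using the monotonic m=+... offsets printed in the record:

package main

import "fmt"

// Verifies the relation in the "Observed pod startup duration" record:
// SLO duration = end-to-end startup time - image-pull window.
func main() {
	const (
		firstStartedPulling = 621.098362838 // m=+ offset, seconds
		lastFinishedPulling = 622.479654043 // m=+ offset, seconds
		podStartE2E         = 2.589806745   // podStartE2EDuration, seconds
	)
	pull := lastFinishedPulling - firstStartedPulling
	fmt.Printf("pull window:  %.9fs\n", pull)
	fmt.Printf("SLO duration: %.8fs\n", podStartE2E-pull) // prints 1.20851554, matching the log
}
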
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.022841 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dcbnv\" (UniqueName: \"kubernetes.io/projected/5413de9b-2a29-40e8-ace1-8bcd650af14a-kube-api-access-dcbnv\") on node \"crc\" DevicePath \"\""
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.595890 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566630-bz4vf" event={"ID":"5413de9b-2a29-40e8-ace1-8bcd650af14a","Type":"ContainerDied","Data":"b9a35cae3cd36efca6efee1813a8323e9e2f1eb5e9fbd0639ec620f67abc9777"}
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.595958 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566630-bz4vf"
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.595968 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9a35cae3cd36efca6efee1813a8323e9e2f1eb5e9fbd0639ec620f67abc9777"
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.664005 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-gtbp8"]
Mar 20 09:10:05 crc kubenswrapper[4958]: I0320 09:10:05.667784 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566624-gtbp8"]
Mar 20 09:10:06 crc kubenswrapper[4958]: I0320 09:10:06.447551 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2a79103-8b2b-4ac4-88b0-e03a82ead6ab" path="/var/lib/kubelet/pods/a2a79103-8b2b-4ac4-88b0-e03a82ead6ab/volumes"
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.521675 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.522393 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.522477 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.523515 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.523644 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38" gracePeriod=600
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.741670 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38" exitCode=0
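
The recurring patch_prober/prober records show the kubelet's HTTP liveness probe against the machine-config-daemon failing with connection refused at 127.0.0.1:8798 until, as above, the container is killed with its 600s grace period and restarted. A rough Go equivalent of one probe attempt; only the URL comes from the log, while the one-second timeout and the 2xx/3xx success rule are assumed kubelet-style defaults:

package main

import (
	"fmt"
	"net/http"
	"time"
)

// A minimal stand-in for the kubelet's HTTP liveness probe against the
// machine-config-daemon's health endpoint.
func probe(url string) error {
	client := &http.Client{Timeout: 1 * time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return err // e.g. "connect: connection refused", as logged above
	}
	defer resp.Body.Close()
	if resp.StatusCode >= 200 && resp.StatusCode < 400 {
		return nil
	}
	return fmt.Errorf("unexpected status %d", resp.StatusCode)
}

func main() {
	if err := probe("http://127.0.0.1:8798/health"); err != nil {
		fmt.Println("Probe failed:", err)
	}
}
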
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.741755 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38"}
Mar 20 09:10:26 crc kubenswrapper[4958]: I0320 09:10:26.742121 4958 scope.go:117] "RemoveContainer" containerID="58ed31675b41c4d2716ac9083f69cda61ce6ef10102045cfe1b828ff5cb4d12f"
Mar 20 09:10:27 crc kubenswrapper[4958]: I0320 09:10:27.753378 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42"}
Mar 20 09:10:42 crc kubenswrapper[4958]: I0320 09:10:42.842538 4958 scope.go:117] "RemoveContainer" containerID="e924a73bca1630d3b50cbb2a554091a99a53e0141904466e9ffe481daed22d71"
Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.147726 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566632-nvgj8"]
Mar 20 09:12:00 crc kubenswrapper[4958]: E0320 09:12:00.148732 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5413de9b-2a29-40e8-ace1-8bcd650af14a" containerName="oc"
Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.148754 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="5413de9b-2a29-40e8-ace1-8bcd650af14a" containerName="oc"
Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.148924 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="5413de9b-2a29-40e8-ace1-8bcd650af14a" containerName="oc"
Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.149539 4958 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.154780 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.155105 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.155227 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.170298 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-nvgj8"] Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.182457 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-788t2\" (UniqueName: \"kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2\") pod \"auto-csr-approver-29566632-nvgj8\" (UID: \"65abaa7b-f291-4255-b84c-29352c3e6ea0\") " pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.283761 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-788t2\" (UniqueName: \"kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2\") pod \"auto-csr-approver-29566632-nvgj8\" (UID: \"65abaa7b-f291-4255-b84c-29352c3e6ea0\") " pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.303274 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-788t2\" (UniqueName: \"kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2\") pod \"auto-csr-approver-29566632-nvgj8\" (UID: \"65abaa7b-f291-4255-b84c-29352c3e6ea0\") " pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.484660 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:00 crc kubenswrapper[4958]: I0320 09:12:00.686783 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-nvgj8"] Mar 20 09:12:01 crc kubenswrapper[4958]: I0320 09:12:01.462242 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" event={"ID":"65abaa7b-f291-4255-b84c-29352c3e6ea0","Type":"ContainerStarted","Data":"1d09cac851d6775d6d4ab769d129b4151ae6ab9cfa0b686c80b2c3166f85fe39"} Mar 20 09:12:02 crc kubenswrapper[4958]: I0320 09:12:02.473624 4958 generic.go:334] "Generic (PLEG): container finished" podID="65abaa7b-f291-4255-b84c-29352c3e6ea0" containerID="345d2342735db1e9c95407176c092de85b7fbb08e026fc7f81f9165c146d8d53" exitCode=0 Mar 20 09:12:02 crc kubenswrapper[4958]: I0320 09:12:02.473893 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" event={"ID":"65abaa7b-f291-4255-b84c-29352c3e6ea0","Type":"ContainerDied","Data":"345d2342735db1e9c95407176c092de85b7fbb08e026fc7f81f9165c146d8d53"} Mar 20 09:12:03 crc kubenswrapper[4958]: I0320 09:12:03.692496 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:03 crc kubenswrapper[4958]: I0320 09:12:03.729614 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-788t2\" (UniqueName: \"kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2\") pod \"65abaa7b-f291-4255-b84c-29352c3e6ea0\" (UID: \"65abaa7b-f291-4255-b84c-29352c3e6ea0\") " Mar 20 09:12:03 crc kubenswrapper[4958]: I0320 09:12:03.739798 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2" (OuterVolumeSpecName: "kube-api-access-788t2") pod "65abaa7b-f291-4255-b84c-29352c3e6ea0" (UID: "65abaa7b-f291-4255-b84c-29352c3e6ea0"). InnerVolumeSpecName "kube-api-access-788t2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:12:03 crc kubenswrapper[4958]: I0320 09:12:03.831506 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-788t2\" (UniqueName: \"kubernetes.io/projected/65abaa7b-f291-4255-b84c-29352c3e6ea0-kube-api-access-788t2\") on node \"crc\" DevicePath \"\"" Mar 20 09:12:04 crc kubenswrapper[4958]: I0320 09:12:04.487304 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" event={"ID":"65abaa7b-f291-4255-b84c-29352c3e6ea0","Type":"ContainerDied","Data":"1d09cac851d6775d6d4ab769d129b4151ae6ab9cfa0b686c80b2c3166f85fe39"} Mar 20 09:12:04 crc kubenswrapper[4958]: I0320 09:12:04.487341 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1d09cac851d6775d6d4ab769d129b4151ae6ab9cfa0b686c80b2c3166f85fe39" Mar 20 09:12:04 crc kubenswrapper[4958]: I0320 09:12:04.487387 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566632-nvgj8" Mar 20 09:12:04 crc kubenswrapper[4958]: I0320 09:12:04.751271 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-k6brk"] Mar 20 09:12:04 crc kubenswrapper[4958]: I0320 09:12:04.757122 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566626-k6brk"] Mar 20 09:12:06 crc kubenswrapper[4958]: I0320 09:12:06.446186 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c37025e7-c9ef-4f2b-bddd-fe015cb30722" path="/var/lib/kubelet/pods/c37025e7-c9ef-4f2b-bddd-fe015cb30722/volumes" Mar 20 09:12:26 crc kubenswrapper[4958]: I0320 09:12:26.521555 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:12:26 crc kubenswrapper[4958]: I0320 09:12:26.522491 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:12:42 crc kubenswrapper[4958]: I0320 09:12:42.928483 4958 scope.go:117] "RemoveContainer" containerID="fc2bb1acaf8b13cd480fc90bd5409f5f4e2efab85cee97a0a77c863d31245fa2" Mar 20 09:12:50 crc kubenswrapper[4958]: I0320 09:12:50.806567 4958 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 09:12:56 crc kubenswrapper[4958]: I0320 09:12:56.521846 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:12:56 crc kubenswrapper[4958]: I0320 09:12:56.522717 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.055423 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-67r7n"] Mar 20 09:13:16 crc kubenswrapper[4958]: E0320 09:13:16.059892 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65abaa7b-f291-4255-b84c-29352c3e6ea0" containerName="oc" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.059914 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="65abaa7b-f291-4255-b84c-29352c3e6ea0" containerName="oc" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.060044 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="65abaa7b-f291-4255-b84c-29352c3e6ea0" containerName="oc" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.060562 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.063205 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-wlk9j" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.063279 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.063439 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.071078 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-mgmxx"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.072012 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mgmxx" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.077062 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-5nr7r" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.085400 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-67r7n"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.098403 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mgmxx"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.102385 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2xx4x"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.104127 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.107619 4958 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-698wp" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.120848 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2xx4x"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.200863 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/533c37c3-c235-4cc8-9937-96afff9fe513-kube-api-access-fs7cj\") pod \"cert-manager-cainjector-cf98fcc89-67r7n\" (UID: \"533c37c3-c235-4cc8-9937-96afff9fe513\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.201723 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrh2q\" (UniqueName: \"kubernetes.io/projected/5f1f6ba4-f472-4abb-a53d-72e17ac83d43-kube-api-access-zrh2q\") pod \"cert-manager-858654f9db-mgmxx\" (UID: \"5f1f6ba4-f472-4abb-a53d-72e17ac83d43\") " pod="cert-manager/cert-manager-858654f9db-mgmxx" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.303034 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqxxt\" (UniqueName: \"kubernetes.io/projected/46a3cd52-9d0b-48a4-bf54-39fb49633e56-kube-api-access-vqxxt\") pod \"cert-manager-webhook-687f57d79b-2xx4x\" (UID: \"46a3cd52-9d0b-48a4-bf54-39fb49633e56\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:16 crc 
kubenswrapper[4958]: I0320 09:13:16.303128 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/533c37c3-c235-4cc8-9937-96afff9fe513-kube-api-access-fs7cj\") pod \"cert-manager-cainjector-cf98fcc89-67r7n\" (UID: \"533c37c3-c235-4cc8-9937-96afff9fe513\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.303218 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrh2q\" (UniqueName: \"kubernetes.io/projected/5f1f6ba4-f472-4abb-a53d-72e17ac83d43-kube-api-access-zrh2q\") pod \"cert-manager-858654f9db-mgmxx\" (UID: \"5f1f6ba4-f472-4abb-a53d-72e17ac83d43\") " pod="cert-manager/cert-manager-858654f9db-mgmxx" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.326537 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fs7cj\" (UniqueName: \"kubernetes.io/projected/533c37c3-c235-4cc8-9937-96afff9fe513-kube-api-access-fs7cj\") pod \"cert-manager-cainjector-cf98fcc89-67r7n\" (UID: \"533c37c3-c235-4cc8-9937-96afff9fe513\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.330867 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrh2q\" (UniqueName: \"kubernetes.io/projected/5f1f6ba4-f472-4abb-a53d-72e17ac83d43-kube-api-access-zrh2q\") pod \"cert-manager-858654f9db-mgmxx\" (UID: \"5f1f6ba4-f472-4abb-a53d-72e17ac83d43\") " pod="cert-manager/cert-manager-858654f9db-mgmxx" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.389246 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.398200 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-858654f9db-mgmxx" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.404943 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vqxxt\" (UniqueName: \"kubernetes.io/projected/46a3cd52-9d0b-48a4-bf54-39fb49633e56-kube-api-access-vqxxt\") pod \"cert-manager-webhook-687f57d79b-2xx4x\" (UID: \"46a3cd52-9d0b-48a4-bf54-39fb49633e56\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.449087 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqxxt\" (UniqueName: \"kubernetes.io/projected/46a3cd52-9d0b-48a4-bf54-39fb49633e56-kube-api-access-vqxxt\") pod \"cert-manager-webhook-687f57d79b-2xx4x\" (UID: \"46a3cd52-9d0b-48a4-bf54-39fb49633e56\") " pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.612634 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-67r7n"] Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.624208 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.663804 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-mgmxx"] Mar 20 09:13:16 crc kubenswrapper[4958]: W0320 09:13:16.676290 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5f1f6ba4_f472_4abb_a53d_72e17ac83d43.slice/crio-b66b38d6ca781e66737fa23aaa214103e7910a0914548b72661560b48149de37 WatchSource:0}: Error finding container b66b38d6ca781e66737fa23aaa214103e7910a0914548b72661560b48149de37: Status 404 returned error can't find the container with id b66b38d6ca781e66737fa23aaa214103e7910a0914548b72661560b48149de37 Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.721453 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:16 crc kubenswrapper[4958]: I0320 09:13:16.962071 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-2xx4x"] Mar 20 09:13:16 crc kubenswrapper[4958]: W0320 09:13:16.972044 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod46a3cd52_9d0b_48a4_bf54_39fb49633e56.slice/crio-51c9743aa15ee5b9123a7ba5a897b8beb3cd716957895c1c68980529afb074a5 WatchSource:0}: Error finding container 51c9743aa15ee5b9123a7ba5a897b8beb3cd716957895c1c68980529afb074a5: Status 404 returned error can't find the container with id 51c9743aa15ee5b9123a7ba5a897b8beb3cd716957895c1c68980529afb074a5 Mar 20 09:13:17 crc kubenswrapper[4958]: I0320 09:13:17.020646 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mgmxx" event={"ID":"5f1f6ba4-f472-4abb-a53d-72e17ac83d43","Type":"ContainerStarted","Data":"b66b38d6ca781e66737fa23aaa214103e7910a0914548b72661560b48149de37"} Mar 20 09:13:17 crc kubenswrapper[4958]: I0320 09:13:17.022117 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" event={"ID":"46a3cd52-9d0b-48a4-bf54-39fb49633e56","Type":"ContainerStarted","Data":"51c9743aa15ee5b9123a7ba5a897b8beb3cd716957895c1c68980529afb074a5"} Mar 20 09:13:17 crc kubenswrapper[4958]: I0320 09:13:17.023756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" event={"ID":"533c37c3-c235-4cc8-9937-96afff9fe513","Type":"ContainerStarted","Data":"8ac2d4e00ac3a5ab6b5dbf3aa0738789c131015965018266d7d199f101ae14e9"} Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.046335 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" event={"ID":"46a3cd52-9d0b-48a4-bf54-39fb49633e56","Type":"ContainerStarted","Data":"f263aa1e3411a64abdb9dad42767eee997335bc03c85b668b6a21d663f8a8fa9"} Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.046875 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.049012 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" event={"ID":"533c37c3-c235-4cc8-9937-96afff9fe513","Type":"ContainerStarted","Data":"41e5ac5f5654f768eac4d50133ef851c2355551cea2ef258f781db0c3528a330"} Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.051051 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-mgmxx" event={"ID":"5f1f6ba4-f472-4abb-a53d-72e17ac83d43","Type":"ContainerStarted","Data":"ae9c6af3219bd7b5797387a46b1368303c6a8e3bd50b70e277936f4d2086d374"} Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.067387 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" podStartSLOduration=1.921982297 podStartE2EDuration="5.067368163s" podCreationTimestamp="2026-03-20 09:13:16 +0000 UTC" firstStartedPulling="2026-03-20 09:13:16.976606249 +0000 UTC m=+817.298622207" lastFinishedPulling="2026-03-20 09:13:20.121992115 +0000 UTC m=+820.444008073" observedRunningTime="2026-03-20 09:13:21.064772361 +0000 UTC m=+821.386788329" watchObservedRunningTime="2026-03-20 09:13:21.067368163 +0000 UTC 
m=+821.389384121" Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.084248 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-67r7n" podStartSLOduration=1.562602809 podStartE2EDuration="5.084227488s" podCreationTimestamp="2026-03-20 09:13:16 +0000 UTC" firstStartedPulling="2026-03-20 09:13:16.623943647 +0000 UTC m=+816.945959605" lastFinishedPulling="2026-03-20 09:13:20.145568326 +0000 UTC m=+820.467584284" observedRunningTime="2026-03-20 09:13:21.080855795 +0000 UTC m=+821.402871763" watchObservedRunningTime="2026-03-20 09:13:21.084227488 +0000 UTC m=+821.406243446" Mar 20 09:13:21 crc kubenswrapper[4958]: I0320 09:13:21.109302 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-mgmxx" podStartSLOduration=1.661223751 podStartE2EDuration="5.10928258s" podCreationTimestamp="2026-03-20 09:13:16 +0000 UTC" firstStartedPulling="2026-03-20 09:13:16.680127477 +0000 UTC m=+817.002143445" lastFinishedPulling="2026-03-20 09:13:20.128186316 +0000 UTC m=+820.450202274" observedRunningTime="2026-03-20 09:13:21.107045848 +0000 UTC m=+821.429061806" watchObservedRunningTime="2026-03-20 09:13:21.10928258 +0000 UTC m=+821.431298538" Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.917466 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmjtz"] Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918473 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-controller" containerID="cri-o://29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" gracePeriod=30 Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918581 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="northd" containerID="cri-o://8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" gracePeriod=30 Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918579 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" gracePeriod=30 Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918689 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-acl-logging" containerID="cri-o://7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" gracePeriod=30 Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918615 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-node" containerID="cri-o://ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" gracePeriod=30 Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918793 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="sbdb" 
containerID="cri-o://f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" gracePeriod=30
Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.918564 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="nbdb" containerID="cri-o://ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" gracePeriod=30
Mar 20 09:13:24 crc kubenswrapper[4958]: I0320 09:13:24.966207 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" containerID="cri-o://a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9" gracePeriod=30
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.078476 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lht4x_1479666a-d3f9-47dc-aa36-45cc7425d7ee/kube-multus/1.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.080373 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lht4x_1479666a-d3f9-47dc-aa36-45cc7425d7ee/kube-multus/0.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.080413 4958 generic.go:334] "Generic (PLEG): container finished" podID="1479666a-d3f9-47dc-aa36-45cc7425d7ee" containerID="c1fa38ee671c6c3b38ada148c663ec96fd3a75dee770fb81c797ad6fa7b1b033" exitCode=2
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.080466 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerDied","Data":"c1fa38ee671c6c3b38ada148c663ec96fd3a75dee770fb81c797ad6fa7b1b033"}
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.080511 4958 scope.go:117] "RemoveContainer" containerID="b697f9de42fa8d16a6b245b03a40dac3380616efe86805c06d08df6424c49623"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.081118 4958 scope.go:117] "RemoveContainer" containerID="c1fa38ee671c6c3b38ada148c663ec96fd3a75dee770fb81c797ad6fa7b1b033"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.097729 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.101198 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-acl-logging/0.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.101789 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-controller/0.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102413 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" exitCode=0
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102510 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" exitCode=0
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102606 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" exitCode=143
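
exitCode=143 follows the 128+signal convention (128+15, SIGTERM): containers that catch SIGTERM and exit cleanly report 0, like the two kube-rbac-proxy containers above, while those terminated by the signal itself report 143, like ovn-acl-logging and ovn-controller. A process-level sketch of the "Killing container with a grace period" sequence; illustrative only, since the kubelet does this through CRI-O rather than os/exec:

package main

import (
	"fmt"
	"log"
	"os/exec"
	"syscall"
	"time"
)

// Send SIGTERM, wait up to the grace period for the process to exit, then
// escalate to SIGKILL. A SIGTERM death maps to 128+15=143 in container
// terms; a SIGKILL death would map to 128+9=137.
func killWithGrace(cmd *exec.Cmd, grace time.Duration) {
	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()
	cmd.Process.Signal(syscall.SIGTERM)
	select {
	case <-done: // exited (or was signalled) within the grace period
	case <-time.After(grace):
		cmd.Process.Kill() // grace period expired
		<-done
	}
}

func main() {
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		log.Fatal(err)
	}
	killWithGrace(cmd, 30*time.Second)
	fmt.Println(cmd.ProcessState) // "signal: terminated"
}
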
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102676 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" exitCode=143
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102537 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"}
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102814 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"}
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102878 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"}
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.102940 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"}
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.257237 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.259447 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-acl-logging/0.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.259917 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-controller/0.log"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.260423 4958 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.309888 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-8fffc"] Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310113 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kubecfg-setup" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310126 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kubecfg-setup" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310138 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310144 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310154 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="sbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310160 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="sbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310169 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310175 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310183 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310188 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310197 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-acl-logging" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310204 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-acl-logging" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310215 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="nbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310222 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="nbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310229 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-node" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310239 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-node" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310247 4958 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310253 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310262 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="northd" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310268 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="northd" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310351 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="nbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310361 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310369 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-acl-logging" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310378 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovn-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310384 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="sbdb" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310392 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310400 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="northd" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310406 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310412 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="kube-rbac-proxy-node" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310420 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310504 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310510 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: E0320 09:13:25.310523 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.310530 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 
09:13:25.310638 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="eb4de400-dc39-4926-8311-279b913e5871" containerName="ovnkube-controller" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.312428 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359343 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359410 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359448 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359475 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359503 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpg76\" (UniqueName: \"kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359552 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359565 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359619 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359656 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359578 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359693 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359615 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash" (OuterVolumeSpecName: "host-slash") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359646 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket" (OuterVolumeSpecName: "log-socket") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359679 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359725 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-systemd\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359743 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359753 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359788 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359820 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359848 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359872 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359909 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359959 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.359988 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360008 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360029 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: 
\"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides\") pod \"eb4de400-dc39-4926-8311-279b913e5871\" (UID: \"eb4de400-dc39-4926-8311-279b913e5871\") " Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360117 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360158 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360181 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-run-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360205 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log" (OuterVolumeSpecName: "node-log") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360229 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "host-cni-netd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360315 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-config\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360358 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-script-lib\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360388 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360422 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-netd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360576 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360589 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-slash\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360610 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360631 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "var-lib-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360654 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360661 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360660 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-systemd-units\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360860 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-systemd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360891 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-kubelet\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360914 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-node-log\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360933 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m9tk9\" (UniqueName: \"kubernetes.io/projected/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-kube-api-access-m9tk9\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360956 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-log-socket\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360971 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovn-node-metrics-cert\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.360986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-bin\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361009 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361018 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-ovn\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361045 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361148 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-var-lib-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361251 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361317 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-etc-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361345 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-env-overrides\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361429 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-netns\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc"
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361532 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-script-lib\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361550 4958 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-bin\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361562 4958 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-slash\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361574 4958 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-netns\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361586 4958 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-log-socket\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361619 4958 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-kubelet\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361632 4958 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361644 4958 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-ovnkube-config\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361658 4958 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361670 4958 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-node-log\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361683 4958 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-cni-netd\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361718 4958 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361731 4958 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-var-lib-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361745 4958 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361758 4958 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-etc-openvswitch\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361771 4958 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-systemd-units\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.361784 4958 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/eb4de400-dc39-4926-8311-279b913e5871-env-overrides\") on node \"crc\" DevicePath \"\""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.366140 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.366146 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76" (OuterVolumeSpecName: "kube-api-access-gpg76") pod "eb4de400-dc39-4926-8311-279b913e5871" (UID: "eb4de400-dc39-4926-8311-279b913e5871"). InnerVolumeSpecName "kube-api-access-gpg76". PluginName "kubernetes.io/projected", VolumeGidValue ""
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.470567 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-etc-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.470650 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-env-overrides\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.470716 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-netns\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.470732 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-etc-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.471792 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-env-overrides\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.472753 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-config\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.470766 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-config\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.472880 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-script-lib\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.472925 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 
09:13:25.472968 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-netd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473032 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-slash\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473094 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-systemd-units\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473214 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-netns\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473509 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-slash\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473543 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473561 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-netd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.473743 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-systemd-units\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474026 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-systemd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474156 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-kubelet\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474191 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovnkube-script-lib\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474221 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-systemd\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474231 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-node-log\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474201 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-node-log\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474272 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-kubelet\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474288 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m9tk9\" (UniqueName: \"kubernetes.io/projected/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-kube-api-access-m9tk9\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474327 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-log-socket\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474347 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-bin\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474372 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovn-node-metrics-cert\") pod \"ovnkube-node-8fffc\" (UID: 
\"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474401 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-ovn\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474441 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474471 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-var-lib-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474501 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474555 4958 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/eb4de400-dc39-4926-8311-279b913e5871-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474571 4958 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/eb4de400-dc39-4926-8311-279b913e5871-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474584 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpg76\" (UniqueName: \"kubernetes.io/projected/eb4de400-dc39-4926-8311-279b913e5871-kube-api-access-gpg76\") on node \"crc\" DevicePath \"\"" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474632 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-run-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474665 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-log-socket\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474703 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-run-ovn\") pod \"ovnkube-node-8fffc\" (UID: 
\"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474739 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-cni-bin\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474862 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.474911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-var-lib-openvswitch\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.478588 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-ovn-node-metrics-cert\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.499416 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m9tk9\" (UniqueName: \"kubernetes.io/projected/601cfaa3-4ec9-45f4-8525-9cfd79ee5737-kube-api-access-m9tk9\") pod \"ovnkube-node-8fffc\" (UID: \"601cfaa3-4ec9-45f4-8525-9cfd79ee5737\") " pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: I0320 09:13:25.631426 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:25 crc kubenswrapper[4958]: W0320 09:13:25.653937 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod601cfaa3_4ec9_45f4_8525_9cfd79ee5737.slice/crio-32c07e428914b2f94b3c73e840ed30ead34a06b34470c4b84bcbe9b51e3a0de3 WatchSource:0}: Error finding container 32c07e428914b2f94b3c73e840ed30ead34a06b34470c4b84bcbe9b51e3a0de3: Status 404 returned error can't find the container with id 32c07e428914b2f94b3c73e840ed30ead34a06b34470c4b84bcbe9b51e3a0de3 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.112830 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lht4x_1479666a-d3f9-47dc-aa36-45cc7425d7ee/kube-multus/1.log" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.113275 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lht4x" event={"ID":"1479666a-d3f9-47dc-aa36-45cc7425d7ee","Type":"ContainerStarted","Data":"a99e4e94ce30ef4ce8bae20e2c6ecb12fb2ac9ed6224deb1e75341f56d0f1869"} Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.116643 4958 generic.go:334] "Generic (PLEG): container finished" podID="601cfaa3-4ec9-45f4-8525-9cfd79ee5737" containerID="20c6ac66eae90a774149ea920a7739820c490289ee210ac9e69673ec53f257fc" exitCode=0 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.116787 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerDied","Data":"20c6ac66eae90a774149ea920a7739820c490289ee210ac9e69673ec53f257fc"} Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.116993 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"32c07e428914b2f94b3c73e840ed30ead34a06b34470c4b84bcbe9b51e3a0de3"} Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.120667 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovnkube-controller/2.log" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.136082 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-acl-logging/0.log" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137180 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-tmjtz_eb4de400-dc39-4926-8311-279b913e5871/ovn-controller/0.log" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137522 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9" exitCode=0 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137544 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" exitCode=0 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137553 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" exitCode=0 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137562 
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137562 4958 generic.go:334] "Generic (PLEG): container finished" podID="eb4de400-dc39-4926-8311-279b913e5871" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" exitCode=0
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137617 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"}
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137659 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"}
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137854 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"}
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137869 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"}
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz" event={"ID":"eb4de400-dc39-4926-8311-279b913e5871","Type":"ContainerDied","Data":"e1d4a03bf8affed2ba168af7dff8dc9fe51eb5be068bd9fa84b35e70a3eeffd6"}
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.137918 4958 scope.go:117] "RemoveContainer" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.138040 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-tmjtz"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.172342 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.220638 4958 scope.go:117] "RemoveContainer" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.243377 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmjtz"]
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.246969 4958 scope.go:117] "RemoveContainer" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.252518 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-tmjtz"]
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.264772 4958 scope.go:117] "RemoveContainer" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.284095 4958 scope.go:117] "RemoveContainer" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.307246 4958 scope.go:117] "RemoveContainer" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.323665 4958 scope.go:117] "RemoveContainer" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.345670 4958 scope.go:117] "RemoveContainer" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.375764 4958 scope.go:117] "RemoveContainer" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.398425 4958 scope.go:117] "RemoveContainer" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"
Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.399944 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": container with ID starting with a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9 not found: ID does not exist" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.400006 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"} err="failed to get container status \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": rpc error: code = NotFound desc = could not find container \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": container with ID starting with a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9 not found: ID does not exist"
Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.400046 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"
service failed" err="rpc error: code = NotFound desc = could not find container \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": container with ID starting with 14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7 not found: ID does not exist" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.400780 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} err="failed to get container status \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": rpc error: code = NotFound desc = could not find container \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": container with ID starting with 14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.400814 4958 scope.go:117] "RemoveContainer" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.401228 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": container with ID starting with f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d not found: ID does not exist" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.401300 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"} err="failed to get container status \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": rpc error: code = NotFound desc = could not find container \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": container with ID starting with f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.401348 4958 scope.go:117] "RemoveContainer" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.401812 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": container with ID starting with ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7 not found: ID does not exist" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.401846 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"} err="failed to get container status \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": rpc error: code = NotFound desc = could not find container \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": container with ID starting with ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.401871 4958 scope.go:117] "RemoveContainer" 
containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.402353 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": container with ID starting with 8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9 not found: ID does not exist" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.402384 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"} err="failed to get container status \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": rpc error: code = NotFound desc = could not find container \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": container with ID starting with 8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.402403 4958 scope.go:117] "RemoveContainer" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.402740 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": container with ID starting with 4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a not found: ID does not exist" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.402766 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"} err="failed to get container status \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": rpc error: code = NotFound desc = could not find container \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": container with ID starting with 4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.402780 4958 scope.go:117] "RemoveContainer" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.403056 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": container with ID starting with ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211 not found: ID does not exist" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403092 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"} err="failed to get container status \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": rpc error: code = NotFound desc = could not find container \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": container with ID starting with 
ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403114 4958 scope.go:117] "RemoveContainer" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.403490 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": container with ID starting with 7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6 not found: ID does not exist" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403515 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"} err="failed to get container status \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": rpc error: code = NotFound desc = could not find container \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": container with ID starting with 7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403532 4958 scope.go:117] "RemoveContainer" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.403859 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": container with ID starting with 29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4 not found: ID does not exist" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403904 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"} err="failed to get container status \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": rpc error: code = NotFound desc = could not find container \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": container with ID starting with 29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.403934 4958 scope.go:117] "RemoveContainer" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" Mar 20 09:13:26 crc kubenswrapper[4958]: E0320 09:13:26.404319 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": container with ID starting with 1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68 not found: ID does not exist" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.404358 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"} err="failed to get container status \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": rpc 
error: code = NotFound desc = could not find container \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": container with ID starting with 1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.404382 4958 scope.go:117] "RemoveContainer" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.404853 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"} err="failed to get container status \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": rpc error: code = NotFound desc = could not find container \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": container with ID starting with a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.404890 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405188 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} err="failed to get container status \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": rpc error: code = NotFound desc = could not find container \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": container with ID starting with 14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405224 4958 scope.go:117] "RemoveContainer" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405490 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"} err="failed to get container status \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": rpc error: code = NotFound desc = could not find container \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": container with ID starting with f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405519 4958 scope.go:117] "RemoveContainer" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405774 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"} err="failed to get container status \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": rpc error: code = NotFound desc = could not find container \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": container with ID starting with ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.405811 4958 scope.go:117] "RemoveContainer" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" Mar 20 09:13:26 crc 
kubenswrapper[4958]: I0320 09:13:26.406162 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"} err="failed to get container status \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": rpc error: code = NotFound desc = could not find container \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": container with ID starting with 8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.406203 4958 scope.go:117] "RemoveContainer" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.406475 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"} err="failed to get container status \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": rpc error: code = NotFound desc = could not find container \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": container with ID starting with 4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.406505 4958 scope.go:117] "RemoveContainer" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407045 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"} err="failed to get container status \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": rpc error: code = NotFound desc = could not find container \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": container with ID starting with ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407080 4958 scope.go:117] "RemoveContainer" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407410 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"} err="failed to get container status \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": rpc error: code = NotFound desc = could not find container \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": container with ID starting with 7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407439 4958 scope.go:117] "RemoveContainer" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407836 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"} err="failed to get container status \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": rpc error: code = NotFound desc = could not find container \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": container with ID 
starting with 29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.407876 4958 scope.go:117] "RemoveContainer" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.408246 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"} err="failed to get container status \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": rpc error: code = NotFound desc = could not find container \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": container with ID starting with 1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.408274 4958 scope.go:117] "RemoveContainer" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.408578 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"} err="failed to get container status \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": rpc error: code = NotFound desc = could not find container \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": container with ID starting with a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.408841 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.409098 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} err="failed to get container status \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": rpc error: code = NotFound desc = could not find container \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": container with ID starting with 14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.409134 4958 scope.go:117] "RemoveContainer" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.409459 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"} err="failed to get container status \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": rpc error: code = NotFound desc = could not find container \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": container with ID starting with f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.409486 4958 scope.go:117] "RemoveContainer" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.409990 4958 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"} err="failed to get container status \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": rpc error: code = NotFound desc = could not find container \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": container with ID starting with ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.410032 4958 scope.go:117] "RemoveContainer" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.410392 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"} err="failed to get container status \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": rpc error: code = NotFound desc = could not find container \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": container with ID starting with 8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.410420 4958 scope.go:117] "RemoveContainer" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.410832 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"} err="failed to get container status \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": rpc error: code = NotFound desc = could not find container \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": container with ID starting with 4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.410861 4958 scope.go:117] "RemoveContainer" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.411271 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"} err="failed to get container status \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": rpc error: code = NotFound desc = could not find container \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": container with ID starting with ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.411306 4958 scope.go:117] "RemoveContainer" containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.411689 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"} err="failed to get container status \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": rpc error: code = NotFound desc = could not find container \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": container with ID starting with 7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6 not found: ID does not exist" Mar 
20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.411720 4958 scope.go:117] "RemoveContainer" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412047 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"} err="failed to get container status \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": rpc error: code = NotFound desc = could not find container \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": container with ID starting with 29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412078 4958 scope.go:117] "RemoveContainer" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412413 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"} err="failed to get container status \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": rpc error: code = NotFound desc = could not find container \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": container with ID starting with 1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412443 4958 scope.go:117] "RemoveContainer" containerID="a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412838 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9"} err="failed to get container status \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": rpc error: code = NotFound desc = could not find container \"a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9\": container with ID starting with a94aa5fe7c76d7c39038d8698e95f47d2ee47e15d9d5f3b45abc054e46dfc2a9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.412873 4958 scope.go:117] "RemoveContainer" containerID="14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.413247 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7"} err="failed to get container status \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": rpc error: code = NotFound desc = could not find container \"14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7\": container with ID starting with 14fc461c0de62f2c2569753ac66a2fe73856e7149a76d95e8682ba82af5e2ae7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.413286 4958 scope.go:117] "RemoveContainer" containerID="f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.413680 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d"} err="failed to get container status 
\"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": rpc error: code = NotFound desc = could not find container \"f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d\": container with ID starting with f38abfdb378302a003edeae38bb49357866b2cf80646136bdc9f725a5337412d not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.413711 4958 scope.go:117] "RemoveContainer" containerID="ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414034 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7"} err="failed to get container status \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": rpc error: code = NotFound desc = could not find container \"ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7\": container with ID starting with ad21453809cce16d0c29fb6982cc0b96ad473dd7c0da310cb1febd5d0c35b0c7 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414066 4958 scope.go:117] "RemoveContainer" containerID="8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414330 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9"} err="failed to get container status \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": rpc error: code = NotFound desc = could not find container \"8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9\": container with ID starting with 8b0b560642e98f573e409debb58f1f4e21bca8ec89f7dcf76f11f7f26a45c5d9 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414365 4958 scope.go:117] "RemoveContainer" containerID="4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414935 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a"} err="failed to get container status \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": rpc error: code = NotFound desc = could not find container \"4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a\": container with ID starting with 4982d5da52cf6beb2cc01a10ed57d042e8b258f1c7bf920c1f6e2cf16a99f63a not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.414962 4958 scope.go:117] "RemoveContainer" containerID="ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415256 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211"} err="failed to get container status \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": rpc error: code = NotFound desc = could not find container \"ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211\": container with ID starting with ef58594e33e954a55a01a0397a220279c515261b03840536ebec2355e1e00211 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415294 4958 scope.go:117] "RemoveContainer" 
containerID="7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415555 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6"} err="failed to get container status \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": rpc error: code = NotFound desc = could not find container \"7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6\": container with ID starting with 7c59dfdc9890d02fa47458d9740285e9f3cb21fe3f57614c72e4a654d8ed45d6 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415582 4958 scope.go:117] "RemoveContainer" containerID="29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415943 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4"} err="failed to get container status \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": rpc error: code = NotFound desc = could not find container \"29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4\": container with ID starting with 29ed7ffc0d9c731235d80ff97c3f943b0f07ecae13479a972d730a1360a8f3e4 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.415966 4958 scope.go:117] "RemoveContainer" containerID="1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.416304 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68"} err="failed to get container status \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": rpc error: code = NotFound desc = could not find container \"1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68\": container with ID starting with 1fe6ba4e3f04c53051a3753e20f42852ceb37c9a49b322d943255088e531ab68 not found: ID does not exist" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.442173 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb4de400-dc39-4926-8311-279b913e5871" path="/var/lib/kubelet/pods/eb4de400-dc39-4926-8311-279b913e5871/volumes" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.521937 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.522015 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.522079 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.522952 4958 kuberuntime_manager.go:1027] "Message for Container of pod" 
containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.523041 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42" gracePeriod=600 Mar 20 09:13:26 crc kubenswrapper[4958]: I0320 09:13:26.728468 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-2xx4x" Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150479 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"0899e81ab6d026e34ddd11712a173a973babbad82a181c8c46ac843a31e3f8bb"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150837 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"57af181a2f4c4bea6d5445711af4bc32e31473824522b23a632e0feddc457cbf"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150851 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"c726b0c2a046e78c62404c36ef896ea8dd8df2fad932a76d44d3b83044a35e08"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150864 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"26da88f001fa1525c6b1c5f35cfe7cf62e66ca3c97a6abd18061385c1aa5b797"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150875 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"aa667008a926b1c86ab35151c866f11383403462288a308ddfae4888ecdacd73"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.150884 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"15d408880ab23fb2e54bd1ce97bf1f3cf0c7d4afc1acc4986db8549fd72410c7"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.154438 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42" exitCode=0 Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.154493 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.154512 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6"} Mar 20 09:13:27 crc kubenswrapper[4958]: I0320 09:13:27.154532 4958 scope.go:117] "RemoveContainer" containerID="b4b8fd112ac49f49535801cfe26058a0f5192af2f7e3f7bd074e82803f42be38" Mar 20 09:13:29 crc kubenswrapper[4958]: I0320 09:13:29.174773 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"60e0bbb89c812ab5eda9c84eb30c10b043e75426788f0bed546bb2861a9e5904"} Mar 20 09:13:32 crc kubenswrapper[4958]: I0320 09:13:32.197844 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" event={"ID":"601cfaa3-4ec9-45f4-8525-9cfd79ee5737","Type":"ContainerStarted","Data":"e49a3488d046f44c0ca342e89e35096f77d4ce7b6ab972e4ae1325a0b7a1294a"} Mar 20 09:13:32 crc kubenswrapper[4958]: I0320 09:13:32.198811 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:32 crc kubenswrapper[4958]: I0320 09:13:32.232808 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" podStartSLOduration=7.232777483 podStartE2EDuration="7.232777483s" podCreationTimestamp="2026-03-20 09:13:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:13:32.23088263 +0000 UTC m=+832.552898628" watchObservedRunningTime="2026-03-20 09:13:32.232777483 +0000 UTC m=+832.554793451" Mar 20 09:13:32 crc kubenswrapper[4958]: I0320 09:13:32.237352 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:33 crc kubenswrapper[4958]: I0320 09:13:33.203819 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:33 crc kubenswrapper[4958]: I0320 09:13:33.203877 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:33 crc kubenswrapper[4958]: I0320 09:13:33.282342 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:13:55 crc kubenswrapper[4958]: I0320 09:13:55.658641 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-8fffc" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.143125 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566634-k4tm9"] Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.147282 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.149621 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.149642 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.152042 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-k4tm9"] Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.153210 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.296270 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqvxz\" (UniqueName: \"kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz\") pod \"auto-csr-approver-29566634-k4tm9\" (UID: \"3481c9df-80a0-42c9-a2c3-ba845e0f14c0\") " pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.398018 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tqvxz\" (UniqueName: \"kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz\") pod \"auto-csr-approver-29566634-k4tm9\" (UID: \"3481c9df-80a0-42c9-a2c3-ba845e0f14c0\") " pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.421034 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqvxz\" (UniqueName: \"kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz\") pod \"auto-csr-approver-29566634-k4tm9\" (UID: \"3481c9df-80a0-42c9-a2c3-ba845e0f14c0\") " pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.474962 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:00 crc kubenswrapper[4958]: I0320 09:14:00.741476 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-k4tm9"] Mar 20 09:14:01 crc kubenswrapper[4958]: I0320 09:14:01.396476 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" event={"ID":"3481c9df-80a0-42c9-a2c3-ba845e0f14c0","Type":"ContainerStarted","Data":"82572aa71566c61e44748dadc39c30ad1fda210f69ce0525714dac0c0938fad3"} Mar 20 09:14:02 crc kubenswrapper[4958]: I0320 09:14:02.406636 4958 generic.go:334] "Generic (PLEG): container finished" podID="3481c9df-80a0-42c9-a2c3-ba845e0f14c0" containerID="4cce592f1c1354f99af4d2e887753ac54bcaf92082b1fb9167af7935ed89bdbb" exitCode=0 Mar 20 09:14:02 crc kubenswrapper[4958]: I0320 09:14:02.406736 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" event={"ID":"3481c9df-80a0-42c9-a2c3-ba845e0f14c0","Type":"ContainerDied","Data":"4cce592f1c1354f99af4d2e887753ac54bcaf92082b1fb9167af7935ed89bdbb"} Mar 20 09:14:03 crc kubenswrapper[4958]: I0320 09:14:03.652621 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:03 crc kubenswrapper[4958]: I0320 09:14:03.770753 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tqvxz\" (UniqueName: \"kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz\") pod \"3481c9df-80a0-42c9-a2c3-ba845e0f14c0\" (UID: \"3481c9df-80a0-42c9-a2c3-ba845e0f14c0\") " Mar 20 09:14:03 crc kubenswrapper[4958]: I0320 09:14:03.777396 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz" (OuterVolumeSpecName: "kube-api-access-tqvxz") pod "3481c9df-80a0-42c9-a2c3-ba845e0f14c0" (UID: "3481c9df-80a0-42c9-a2c3-ba845e0f14c0"). InnerVolumeSpecName "kube-api-access-tqvxz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:03 crc kubenswrapper[4958]: I0320 09:14:03.872907 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tqvxz\" (UniqueName: \"kubernetes.io/projected/3481c9df-80a0-42c9-a2c3-ba845e0f14c0-kube-api-access-tqvxz\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:04 crc kubenswrapper[4958]: I0320 09:14:04.424105 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" event={"ID":"3481c9df-80a0-42c9-a2c3-ba845e0f14c0","Type":"ContainerDied","Data":"82572aa71566c61e44748dadc39c30ad1fda210f69ce0525714dac0c0938fad3"} Mar 20 09:14:04 crc kubenswrapper[4958]: I0320 09:14:04.424253 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82572aa71566c61e44748dadc39c30ad1fda210f69ce0525714dac0c0938fad3" Mar 20 09:14:04 crc kubenswrapper[4958]: I0320 09:14:04.424185 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566634-k4tm9" Mar 20 09:14:04 crc kubenswrapper[4958]: I0320 09:14:04.734695 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-mrzx6"] Mar 20 09:14:04 crc kubenswrapper[4958]: I0320 09:14:04.738314 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566628-mrzx6"] Mar 20 09:14:06 crc kubenswrapper[4958]: I0320 09:14:06.452510 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e" path="/var/lib/kubelet/pods/b5e3b3a2-ca6c-453c-8f17-cc26bdb5ad0e/volumes" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.584924 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn"] Mar 20 09:14:14 crc kubenswrapper[4958]: E0320 09:14:14.585788 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3481c9df-80a0-42c9-a2c3-ba845e0f14c0" containerName="oc" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.585809 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="3481c9df-80a0-42c9-a2c3-ba845e0f14c0" containerName="oc" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.585947 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="3481c9df-80a0-42c9-a2c3-ba845e0f14c0" containerName="oc" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.586918 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.589100 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.598908 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn"] Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.740619 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8glzw\" (UniqueName: \"kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.741222 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.741254 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.842715 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8glzw\" (UniqueName: \"kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.842843 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.842898 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.843944 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.844100 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.881820 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8glzw\" (UniqueName: \"kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:14 crc kubenswrapper[4958]: I0320 09:14:14.905445 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:15 crc kubenswrapper[4958]: I0320 09:14:15.327970 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn"] Mar 20 09:14:15 crc kubenswrapper[4958]: I0320 09:14:15.826528 4958 generic.go:334] "Generic (PLEG): container finished" podID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerID="129986f0453f7ea1ec98039d519dcb8f3eacf4ab623559fc9fa35cd93d8b7ee4" exitCode=0 Mar 20 09:14:15 crc kubenswrapper[4958]: I0320 09:14:15.826633 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" event={"ID":"e0b23e56-fd65-47bf-9aae-fc730031e274","Type":"ContainerDied","Data":"129986f0453f7ea1ec98039d519dcb8f3eacf4ab623559fc9fa35cd93d8b7ee4"} Mar 20 09:14:15 crc kubenswrapper[4958]: I0320 09:14:15.826730 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" event={"ID":"e0b23e56-fd65-47bf-9aae-fc730031e274","Type":"ContainerStarted","Data":"452d4a21bfc71bfc0ef741a7738732061ccc4d7f216e58e8101a2661e752fd13"} Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.513818 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.515994 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.522308 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.584044 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.584111 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hscz\" (UniqueName: \"kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.584162 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.685888 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.686006 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.686036 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6hscz\" (UniqueName: \"kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.686867 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.686911 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.719316 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-6hscz\" (UniqueName: \"kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz\") pod \"redhat-operators-tnsmc\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:16 crc kubenswrapper[4958]: I0320 09:14:16.854022 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.105542 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:17 crc kubenswrapper[4958]: W0320 09:14:17.171439 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod62d051f7_25af_4220_9753_263ac96a9e67.slice/crio-e1e739be6b0aa725cc0afe43c805fe8e0d05679c728dc214d9a2b71b31e262fc WatchSource:0}: Error finding container e1e739be6b0aa725cc0afe43c805fe8e0d05679c728dc214d9a2b71b31e262fc: Status 404 returned error can't find the container with id e1e739be6b0aa725cc0afe43c805fe8e0d05679c728dc214d9a2b71b31e262fc Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.840144 4958 generic.go:334] "Generic (PLEG): container finished" podID="62d051f7-25af-4220-9753-263ac96a9e67" containerID="733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b" exitCode=0 Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.840243 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerDied","Data":"733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b"} Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.840287 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerStarted","Data":"e1e739be6b0aa725cc0afe43c805fe8e0d05679c728dc214d9a2b71b31e262fc"} Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.842067 4958 generic.go:334] "Generic (PLEG): container finished" podID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerID="233ffde61cf3a3e59ffecebb49a3071b67163eeaa2f966bb58597ac015bce176" exitCode=0 Mar 20 09:14:17 crc kubenswrapper[4958]: I0320 09:14:17.842100 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" event={"ID":"e0b23e56-fd65-47bf-9aae-fc730031e274","Type":"ContainerDied","Data":"233ffde61cf3a3e59ffecebb49a3071b67163eeaa2f966bb58597ac015bce176"} Mar 20 09:14:18 crc kubenswrapper[4958]: I0320 09:14:18.852837 4958 generic.go:334] "Generic (PLEG): container finished" podID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerID="01a49a1be6da631c21d88ce75dc90195acbbcd2adcbb32ed2c6d15b049edfc55" exitCode=0 Mar 20 09:14:18 crc kubenswrapper[4958]: I0320 09:14:18.852916 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" event={"ID":"e0b23e56-fd65-47bf-9aae-fc730031e274","Type":"ContainerDied","Data":"01a49a1be6da631c21d88ce75dc90195acbbcd2adcbb32ed2c6d15b049edfc55"} Mar 20 09:14:19 crc kubenswrapper[4958]: I0320 09:14:19.865009 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" 
event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerStarted","Data":"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33"} Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.270485 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.340174 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle\") pod \"e0b23e56-fd65-47bf-9aae-fc730031e274\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.340270 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8glzw\" (UniqueName: \"kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw\") pod \"e0b23e56-fd65-47bf-9aae-fc730031e274\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.340337 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util\") pod \"e0b23e56-fd65-47bf-9aae-fc730031e274\" (UID: \"e0b23e56-fd65-47bf-9aae-fc730031e274\") " Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.341037 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle" (OuterVolumeSpecName: "bundle") pod "e0b23e56-fd65-47bf-9aae-fc730031e274" (UID: "e0b23e56-fd65-47bf-9aae-fc730031e274"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.346821 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw" (OuterVolumeSpecName: "kube-api-access-8glzw") pod "e0b23e56-fd65-47bf-9aae-fc730031e274" (UID: "e0b23e56-fd65-47bf-9aae-fc730031e274"). InnerVolumeSpecName "kube-api-access-8glzw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.361257 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util" (OuterVolumeSpecName: "util") pod "e0b23e56-fd65-47bf-9aae-fc730031e274" (UID: "e0b23e56-fd65-47bf-9aae-fc730031e274"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.442403 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-util\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.442452 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e0b23e56-fd65-47bf-9aae-fc730031e274-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.442478 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8glzw\" (UniqueName: \"kubernetes.io/projected/e0b23e56-fd65-47bf-9aae-fc730031e274-kube-api-access-8glzw\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.874751 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" event={"ID":"e0b23e56-fd65-47bf-9aae-fc730031e274","Type":"ContainerDied","Data":"452d4a21bfc71bfc0ef741a7738732061ccc4d7f216e58e8101a2661e752fd13"} Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.874823 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="452d4a21bfc71bfc0ef741a7738732061ccc4d7f216e58e8101a2661e752fd13" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.874920 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn" Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.878971 4958 generic.go:334] "Generic (PLEG): container finished" podID="62d051f7-25af-4220-9753-263ac96a9e67" containerID="6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33" exitCode=0 Mar 20 09:14:20 crc kubenswrapper[4958]: I0320 09:14:20.879018 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerDied","Data":"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33"} Mar 20 09:14:22 crc kubenswrapper[4958]: I0320 09:14:22.897376 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerStarted","Data":"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59"} Mar 20 09:14:22 crc kubenswrapper[4958]: I0320 09:14:22.924486 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-tnsmc" podStartSLOduration=2.811467415 podStartE2EDuration="6.924461852s" podCreationTimestamp="2026-03-20 09:14:16 +0000 UTC" firstStartedPulling="2026-03-20 09:14:17.842367842 +0000 UTC m=+878.164383800" lastFinishedPulling="2026-03-20 09:14:21.955362279 +0000 UTC m=+882.277378237" observedRunningTime="2026-03-20 09:14:22.921390827 +0000 UTC m=+883.243406785" watchObservedRunningTime="2026-03-20 09:14:22.924461852 +0000 UTC m=+883.246477830" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.058556 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sjl76"] Mar 20 09:14:25 crc kubenswrapper[4958]: E0320 09:14:25.059340 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="extract" Mar 20 
09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.059356 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="extract" Mar 20 09:14:25 crc kubenswrapper[4958]: E0320 09:14:25.059373 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="util" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.059379 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="util" Mar 20 09:14:25 crc kubenswrapper[4958]: E0320 09:14:25.059396 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="pull" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.059404 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="pull" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.059523 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e0b23e56-fd65-47bf-9aae-fc730031e274" containerName="extract" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.060023 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.062162 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.062420 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.062612 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-b5qng" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.075890 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sjl76"] Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.220755 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwk9r\" (UniqueName: \"kubernetes.io/projected/f5dcbca6-977c-48d6-a65c-00cc3f7d8787-kube-api-access-vwk9r\") pod \"nmstate-operator-796d4cfff4-sjl76\" (UID: \"f5dcbca6-977c-48d6-a65c-00cc3f7d8787\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.322151 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vwk9r\" (UniqueName: \"kubernetes.io/projected/f5dcbca6-977c-48d6-a65c-00cc3f7d8787-kube-api-access-vwk9r\") pod \"nmstate-operator-796d4cfff4-sjl76\" (UID: \"f5dcbca6-977c-48d6-a65c-00cc3f7d8787\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.353461 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwk9r\" (UniqueName: \"kubernetes.io/projected/f5dcbca6-977c-48d6-a65c-00cc3f7d8787-kube-api-access-vwk9r\") pod \"nmstate-operator-796d4cfff4-sjl76\" (UID: \"f5dcbca6-977c-48d6-a65c-00cc3f7d8787\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.377864 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.634118 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-sjl76"] Mar 20 09:14:25 crc kubenswrapper[4958]: W0320 09:14:25.635787 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf5dcbca6_977c_48d6_a65c_00cc3f7d8787.slice/crio-1cf7e7477f92d2e664b300ec27f3ffd8ec7981f97eaa6d7fbaeed60399bc98a2 WatchSource:0}: Error finding container 1cf7e7477f92d2e664b300ec27f3ffd8ec7981f97eaa6d7fbaeed60399bc98a2: Status 404 returned error can't find the container with id 1cf7e7477f92d2e664b300ec27f3ffd8ec7981f97eaa6d7fbaeed60399bc98a2 Mar 20 09:14:25 crc kubenswrapper[4958]: I0320 09:14:25.918296 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" event={"ID":"f5dcbca6-977c-48d6-a65c-00cc3f7d8787","Type":"ContainerStarted","Data":"1cf7e7477f92d2e664b300ec27f3ffd8ec7981f97eaa6d7fbaeed60399bc98a2"} Mar 20 09:14:26 crc kubenswrapper[4958]: I0320 09:14:26.854848 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:26 crc kubenswrapper[4958]: I0320 09:14:26.855204 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:27 crc kubenswrapper[4958]: I0320 09:14:27.893538 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-tnsmc" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="registry-server" probeResult="failure" output=< Mar 20 09:14:27 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Mar 20 09:14:27 crc kubenswrapper[4958]: > Mar 20 09:14:28 crc kubenswrapper[4958]: I0320 09:14:28.941216 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" event={"ID":"f5dcbca6-977c-48d6-a65c-00cc3f7d8787","Type":"ContainerStarted","Data":"519e7c97eecddf241b7cbd13b38611f9d9aaf0212c486b487bff6151f5806fde"} Mar 20 09:14:28 crc kubenswrapper[4958]: I0320 09:14:28.965648 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-sjl76" podStartSLOduration=1.299744733 podStartE2EDuration="3.965628678s" podCreationTimestamp="2026-03-20 09:14:25 +0000 UTC" firstStartedPulling="2026-03-20 09:14:25.641448927 +0000 UTC m=+885.963464885" lastFinishedPulling="2026-03-20 09:14:28.307332872 +0000 UTC m=+888.629348830" observedRunningTime="2026-03-20 09:14:28.962986365 +0000 UTC m=+889.285002373" watchObservedRunningTime="2026-03-20 09:14:28.965628678 +0000 UTC m=+889.287644636" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.391298 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.393377 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.397631 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-bwpbr" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.403873 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.410794 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kqv85"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.411677 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.415577 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.426907 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kqv85"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.460529 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-jtx5n"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.461409 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.549424 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.550275 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.554365 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.554374 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.554504 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-fj5mq" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.556010 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-24g5g\" (UniqueName: \"kubernetes.io/projected/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-kube-api-access-24g5g\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.556078 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fpvd\" (UniqueName: \"kubernetes.io/projected/edbe510d-bcd7-465b-82e6-8425666a3dae-kube-api-access-5fpvd\") pod \"nmstate-metrics-9b8c8685d-j25jd\" (UID: \"edbe510d-bcd7-465b-82e6-8425666a3dae\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.556098 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.570860 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657774 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9c5966-5322-42c8-b89d-939904508cbf-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657826 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-nmstate-lock\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657851 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc9c5966-5322-42c8-b89d-939904508cbf-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657910 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-24g5g\" (UniqueName: 
\"kubernetes.io/projected/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-kube-api-access-24g5g\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657938 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gvwdz\" (UniqueName: \"kubernetes.io/projected/cc9c5966-5322-42c8-b89d-939904508cbf-kube-api-access-gvwdz\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-dbus-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.657998 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fpvd\" (UniqueName: \"kubernetes.io/projected/edbe510d-bcd7-465b-82e6-8425666a3dae-kube-api-access-5fpvd\") pod \"nmstate-metrics-9b8c8685d-j25jd\" (UID: \"edbe510d-bcd7-465b-82e6-8425666a3dae\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.658019 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.658054 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-ovs-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.658077 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flr85\" (UniqueName: \"kubernetes.io/projected/7462bd93-791f-45b3-943b-9c5ebfdf90ee-kube-api-access-flr85\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: E0320 09:14:34.658093 4958 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 20 09:14:34 crc kubenswrapper[4958]: E0320 09:14:34.658156 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair podName:6c6f8675-4ddc-4254-ae04-40cd4b5199d6 nodeName:}" failed. No retries permitted until 2026-03-20 09:14:35.158134543 +0000 UTC m=+895.480150501 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair") pod "nmstate-webhook-5f558f5558-kqv85" (UID: "6c6f8675-4ddc-4254-ae04-40cd4b5199d6") : secret "openshift-nmstate-webhook" not found Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.688807 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-24g5g\" (UniqueName: \"kubernetes.io/projected/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-kube-api-access-24g5g\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.690450 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fpvd\" (UniqueName: \"kubernetes.io/projected/edbe510d-bcd7-465b-82e6-8425666a3dae-kube-api-access-5fpvd\") pod \"nmstate-metrics-9b8c8685d-j25jd\" (UID: \"edbe510d-bcd7-465b-82e6-8425666a3dae\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.714728 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760267 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9c5966-5322-42c8-b89d-939904508cbf-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760328 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-nmstate-lock\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760358 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc9c5966-5322-42c8-b89d-939904508cbf-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760414 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gvwdz\" (UniqueName: \"kubernetes.io/projected/cc9c5966-5322-42c8-b89d-939904508cbf-kube-api-access-gvwdz\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760441 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-dbus-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760489 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-ovs-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760513 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flr85\" (UniqueName: \"kubernetes.io/projected/7462bd93-791f-45b3-943b-9c5ebfdf90ee-kube-api-access-flr85\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760760 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-nmstate-lock\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.760951 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-ovs-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.761107 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/7462bd93-791f-45b3-943b-9c5ebfdf90ee-dbus-socket\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.761625 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/cc9c5966-5322-42c8-b89d-939904508cbf-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.768081 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/cc9c5966-5322-42c8-b89d-939904508cbf-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.771235 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-b979c7c44-xn5mv"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.772004 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.789013 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b979c7c44-xn5mv"] Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.792711 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gvwdz\" (UniqueName: \"kubernetes.io/projected/cc9c5966-5322-42c8-b89d-939904508cbf-kube-api-access-gvwdz\") pod \"nmstate-console-plugin-86f58fcf4-444cw\" (UID: \"cc9c5966-5322-42c8-b89d-939904508cbf\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.793332 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flr85\" (UniqueName: \"kubernetes.io/projected/7462bd93-791f-45b3-943b-9c5ebfdf90ee-kube-api-access-flr85\") pod \"nmstate-handler-jtx5n\" (UID: \"7462bd93-791f-45b3-943b-9c5ebfdf90ee\") " pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.872707 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963006 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cdbvm\" (UniqueName: \"kubernetes.io/projected/2944d943-67d5-4cb8-a853-d0797e7c0729-kube-api-access-cdbvm\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-console-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963094 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-oauth-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963131 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-service-ca\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963164 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963216 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-trusted-ca-bundle\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:34 crc kubenswrapper[4958]: I0320 09:14:34.963242 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-oauth-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065036 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cdbvm\" (UniqueName: \"kubernetes.io/projected/2944d943-67d5-4cb8-a853-d0797e7c0729-kube-api-access-cdbvm\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065099 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-console-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065136 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-oauth-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065174 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-service-ca\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065200 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065249 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-trusted-ca-bundle\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.065271 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-oauth-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.066994 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-console-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.067028 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-trusted-ca-bundle\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.067237 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-service-ca\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.067297 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/2944d943-67d5-4cb8-a853-d0797e7c0729-oauth-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.073233 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-oauth-config\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.073254 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/2944d943-67d5-4cb8-a853-d0797e7c0729-console-serving-cert\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.081652 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cdbvm\" (UniqueName: \"kubernetes.io/projected/2944d943-67d5-4cb8-a853-d0797e7c0729-kube-api-access-cdbvm\") pod \"console-b979c7c44-xn5mv\" (UID: \"2944d943-67d5-4cb8-a853-d0797e7c0729\") " pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.083668 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.093942 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw"] Mar 20 09:14:35 crc kubenswrapper[4958]: W0320 09:14:35.104218 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7462bd93_791f_45b3_943b_9c5ebfdf90ee.slice/crio-26be4879e81471bc84d4decf2bc85da8ed8ff7a56a004abb053f866dc29e514e WatchSource:0}: Error finding container 26be4879e81471bc84d4decf2bc85da8ed8ff7a56a004abb053f866dc29e514e: Status 404 returned error can't find the container with id 26be4879e81471bc84d4decf2bc85da8ed8ff7a56a004abb053f866dc29e514e Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.126585 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.169523 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.176738 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/6c6f8675-4ddc-4254-ae04-40cd4b5199d6-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-kqv85\" (UID: \"6c6f8675-4ddc-4254-ae04-40cd4b5199d6\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.186120 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd"] Mar 20 09:14:35 crc kubenswrapper[4958]: W0320 09:14:35.192162 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedbe510d_bcd7_465b_82e6_8425666a3dae.slice/crio-cb62da06cfe47909984e60754180c748493c8c419233176768518343aebd9944 WatchSource:0}: Error finding container cb62da06cfe47909984e60754180c748493c8c419233176768518343aebd9944: Status 404 returned error can't find the container with id cb62da06cfe47909984e60754180c748493c8c419233176768518343aebd9944 Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.323095 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.335234 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-b979c7c44-xn5mv"] Mar 20 09:14:35 crc kubenswrapper[4958]: W0320 09:14:35.342681 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2944d943_67d5_4cb8_a853_d0797e7c0729.slice/crio-2d55bbf17708d1628ef6c70ba379bae22f74f8c0e9e31ae70188d3d94e7f24bc WatchSource:0}: Error finding container 2d55bbf17708d1628ef6c70ba379bae22f74f8c0e9e31ae70188d3d94e7f24bc: Status 404 returned error can't find the container with id 2d55bbf17708d1628ef6c70ba379bae22f74f8c0e9e31ae70188d3d94e7f24bc Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.526931 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-kqv85"] Mar 20 09:14:35 crc kubenswrapper[4958]: W0320 09:14:35.536175 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6c6f8675_4ddc_4254_ae04_40cd4b5199d6.slice/crio-1359ebb345460b56ef8984ad889a2fc34dacb4645ef9e5ba395b213ca7808df2 WatchSource:0}: Error finding container 1359ebb345460b56ef8984ad889a2fc34dacb4645ef9e5ba395b213ca7808df2: Status 404 returned error can't find the container with id 1359ebb345460b56ef8984ad889a2fc34dacb4645ef9e5ba395b213ca7808df2 Mar 20 09:14:35 crc kubenswrapper[4958]: I0320 09:14:35.999750 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b979c7c44-xn5mv" event={"ID":"2944d943-67d5-4cb8-a853-d0797e7c0729","Type":"ContainerStarted","Data":"c8b2051665fa7215ea085311d5e54afe768a124a1557790b3eafcce419d6aebf"} Mar 20 09:14:35 crc kubenswrapper[4958]: 
I0320 09:14:35.999873 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-b979c7c44-xn5mv" event={"ID":"2944d943-67d5-4cb8-a853-d0797e7c0729","Type":"ContainerStarted","Data":"2d55bbf17708d1628ef6c70ba379bae22f74f8c0e9e31ae70188d3d94e7f24bc"} Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.001791 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" event={"ID":"cc9c5966-5322-42c8-b89d-939904508cbf","Type":"ContainerStarted","Data":"800f6bdaa1c2785a4621cb5c5ae678b0d4855faf0efda594b2af07dca9d474bb"} Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.003760 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" event={"ID":"6c6f8675-4ddc-4254-ae04-40cd4b5199d6","Type":"ContainerStarted","Data":"1359ebb345460b56ef8984ad889a2fc34dacb4645ef9e5ba395b213ca7808df2"} Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.005635 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" event={"ID":"edbe510d-bcd7-465b-82e6-8425666a3dae","Type":"ContainerStarted","Data":"cb62da06cfe47909984e60754180c748493c8c419233176768518343aebd9944"} Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.007035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jtx5n" event={"ID":"7462bd93-791f-45b3-943b-9c5ebfdf90ee","Type":"ContainerStarted","Data":"26be4879e81471bc84d4decf2bc85da8ed8ff7a56a004abb053f866dc29e514e"} Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.032432 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-b979c7c44-xn5mv" podStartSLOduration=2.032398505 podStartE2EDuration="2.032398505s" podCreationTimestamp="2026-03-20 09:14:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:14:36.027352236 +0000 UTC m=+896.349368194" watchObservedRunningTime="2026-03-20 09:14:36.032398505 +0000 UTC m=+896.354414503" Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.907519 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:36 crc kubenswrapper[4958]: I0320 09:14:36.961338 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:37 crc kubenswrapper[4958]: I0320 09:14:37.140604 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.028084 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-tnsmc" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="registry-server" containerID="cri-o://a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59" gracePeriod=2 Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.631805 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.723801 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6hscz\" (UniqueName: \"kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz\") pod \"62d051f7-25af-4220-9753-263ac96a9e67\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.723880 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities\") pod \"62d051f7-25af-4220-9753-263ac96a9e67\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.723989 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content\") pod \"62d051f7-25af-4220-9753-263ac96a9e67\" (UID: \"62d051f7-25af-4220-9753-263ac96a9e67\") " Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.725335 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities" (OuterVolumeSpecName: "utilities") pod "62d051f7-25af-4220-9753-263ac96a9e67" (UID: "62d051f7-25af-4220-9753-263ac96a9e67"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.728850 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz" (OuterVolumeSpecName: "kube-api-access-6hscz") pod "62d051f7-25af-4220-9753-263ac96a9e67" (UID: "62d051f7-25af-4220-9753-263ac96a9e67"). InnerVolumeSpecName "kube-api-access-6hscz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.826146 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6hscz\" (UniqueName: \"kubernetes.io/projected/62d051f7-25af-4220-9753-263ac96a9e67-kube-api-access-6hscz\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.826187 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.863046 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "62d051f7-25af-4220-9753-263ac96a9e67" (UID: "62d051f7-25af-4220-9753-263ac96a9e67"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:14:38 crc kubenswrapper[4958]: I0320 09:14:38.926647 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/62d051f7-25af-4220-9753-263ac96a9e67-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.036106 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" event={"ID":"edbe510d-bcd7-465b-82e6-8425666a3dae","Type":"ContainerStarted","Data":"923c27a06e68796898617c549b82b549824628cb0b88c67664be7e8bcab605e6"} Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.040657 4958 generic.go:334] "Generic (PLEG): container finished" podID="62d051f7-25af-4220-9753-263ac96a9e67" containerID="a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59" exitCode=0 Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.040753 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerDied","Data":"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59"} Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.040861 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-tnsmc" event={"ID":"62d051f7-25af-4220-9753-263ac96a9e67","Type":"ContainerDied","Data":"e1e739be6b0aa725cc0afe43c805fe8e0d05679c728dc214d9a2b71b31e262fc"} Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.040804 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-tnsmc" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.040948 4958 scope.go:117] "RemoveContainer" containerID="a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.042682 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" event={"ID":"cc9c5966-5322-42c8-b89d-939904508cbf","Type":"ContainerStarted","Data":"d6e4ef666b0e567d61f364c508cd51a47a4bb9aafed7277038d160220cc30abd"} Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.070824 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-444cw" podStartSLOduration=1.520671504 podStartE2EDuration="5.07079538s" podCreationTimestamp="2026-03-20 09:14:34 +0000 UTC" firstStartedPulling="2026-03-20 09:14:35.105403385 +0000 UTC m=+895.427419343" lastFinishedPulling="2026-03-20 09:14:38.655527261 +0000 UTC m=+898.977543219" observedRunningTime="2026-03-20 09:14:39.06790026 +0000 UTC m=+899.389916218" watchObservedRunningTime="2026-03-20 09:14:39.07079538 +0000 UTC m=+899.392811338" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.076841 4958 scope.go:117] "RemoveContainer" containerID="6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.085291 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.091691 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-tnsmc"] Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.110750 4958 scope.go:117] "RemoveContainer" 
containerID="733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.128355 4958 scope.go:117] "RemoveContainer" containerID="a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59" Mar 20 09:14:39 crc kubenswrapper[4958]: E0320 09:14:39.128787 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59\": container with ID starting with a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59 not found: ID does not exist" containerID="a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.128824 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59"} err="failed to get container status \"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59\": rpc error: code = NotFound desc = could not find container \"a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59\": container with ID starting with a92afd764b68983a60c72de4d1e90abe099c106058d9d7e67e26382aa1f26b59 not found: ID does not exist" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.128851 4958 scope.go:117] "RemoveContainer" containerID="6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33" Mar 20 09:14:39 crc kubenswrapper[4958]: E0320 09:14:39.129091 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33\": container with ID starting with 6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33 not found: ID does not exist" containerID="6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.129124 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33"} err="failed to get container status \"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33\": rpc error: code = NotFound desc = could not find container \"6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33\": container with ID starting with 6c426b3229c3eeba31718bc0f25f37bee1e4c7bd4c13cdd8adb6bf611986ce33 not found: ID does not exist" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.129146 4958 scope.go:117] "RemoveContainer" containerID="733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b" Mar 20 09:14:39 crc kubenswrapper[4958]: E0320 09:14:39.129510 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b\": container with ID starting with 733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b not found: ID does not exist" containerID="733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b" Mar 20 09:14:39 crc kubenswrapper[4958]: I0320 09:14:39.129540 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b"} err="failed to get container status \"733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b\": rpc error: code = 
NotFound desc = could not find container \"733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b\": container with ID starting with 733e76386c63a38b025552426c5c5f835166b65090951d622e5e2c31fe23a96b not found: ID does not exist" Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.053921 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" event={"ID":"6c6f8675-4ddc-4254-ae04-40cd4b5199d6","Type":"ContainerStarted","Data":"504894128a79f7dba755522cf9a1960692cc55fa455f4f6d9a5b070955398ae5"} Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.054381 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.058198 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-jtx5n" event={"ID":"7462bd93-791f-45b3-943b-9c5ebfdf90ee","Type":"ContainerStarted","Data":"44b1ab73a1a3ebef17f1e48458afa3b66930233b80291ca140a3ae419d3d146b"} Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.058892 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.097308 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" podStartSLOduration=2.684588303 podStartE2EDuration="6.097267435s" podCreationTimestamp="2026-03-20 09:14:34 +0000 UTC" firstStartedPulling="2026-03-20 09:14:35.539236027 +0000 UTC m=+895.861251985" lastFinishedPulling="2026-03-20 09:14:38.951915159 +0000 UTC m=+899.273931117" observedRunningTime="2026-03-20 09:14:40.077686685 +0000 UTC m=+900.399702653" watchObservedRunningTime="2026-03-20 09:14:40.097267435 +0000 UTC m=+900.419283423" Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.122379 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-jtx5n" podStartSLOduration=2.327926961 podStartE2EDuration="6.122355528s" podCreationTimestamp="2026-03-20 09:14:34 +0000 UTC" firstStartedPulling="2026-03-20 09:14:35.107067331 +0000 UTC m=+895.429083289" lastFinishedPulling="2026-03-20 09:14:38.901495898 +0000 UTC m=+899.223511856" observedRunningTime="2026-03-20 09:14:40.10394315 +0000 UTC m=+900.425959138" watchObservedRunningTime="2026-03-20 09:14:40.122355528 +0000 UTC m=+900.444371486" Mar 20 09:14:40 crc kubenswrapper[4958]: I0320 09:14:40.454694 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="62d051f7-25af-4220-9753-263ac96a9e67" path="/var/lib/kubelet/pods/62d051f7-25af-4220-9753-263ac96a9e67/volumes" Mar 20 09:14:42 crc kubenswrapper[4958]: I0320 09:14:42.077723 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" event={"ID":"edbe510d-bcd7-465b-82e6-8425666a3dae","Type":"ContainerStarted","Data":"81caa5d39d01a8b8d56707d4bdf0cac832009171243872586e7c552ba68446af"} Mar 20 09:14:42 crc kubenswrapper[4958]: I0320 09:14:42.099038 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-j25jd" podStartSLOduration=1.687977901 podStartE2EDuration="8.099012234s" podCreationTimestamp="2026-03-20 09:14:34 +0000 UTC" firstStartedPulling="2026-03-20 09:14:35.195549812 +0000 UTC m=+895.517565770" lastFinishedPulling="2026-03-20 09:14:41.606584105 +0000 UTC m=+901.928600103" 
observedRunningTime="2026-03-20 09:14:42.097462621 +0000 UTC m=+902.419478579" watchObservedRunningTime="2026-03-20 09:14:42.099012234 +0000 UTC m=+902.421028212" Mar 20 09:14:43 crc kubenswrapper[4958]: I0320 09:14:43.015785 4958 scope.go:117] "RemoveContainer" containerID="8c4d4f89fc944bca692270c70c54a731a779528750f7c103e3d829a11a136518" Mar 20 09:14:45 crc kubenswrapper[4958]: I0320 09:14:45.107694 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-jtx5n" Mar 20 09:14:45 crc kubenswrapper[4958]: I0320 09:14:45.127736 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:45 crc kubenswrapper[4958]: I0320 09:14:45.127796 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:45 crc kubenswrapper[4958]: I0320 09:14:45.135729 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:46 crc kubenswrapper[4958]: I0320 09:14:46.105644 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-b979c7c44-xn5mv" Mar 20 09:14:46 crc kubenswrapper[4958]: I0320 09:14:46.174019 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:14:55 crc kubenswrapper[4958]: I0320 09:14:55.330216 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-kqv85" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.151202 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt"] Mar 20 09:15:00 crc kubenswrapper[4958]: E0320 09:15:00.152109 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="extract-utilities" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.152130 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="extract-utilities" Mar 20 09:15:00 crc kubenswrapper[4958]: E0320 09:15:00.152144 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="extract-content" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.152150 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="extract-content" Mar 20 09:15:00 crc kubenswrapper[4958]: E0320 09:15:00.152172 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.152179 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.152321 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="62d051f7-25af-4220-9753-263ac96a9e67" containerName="registry-server" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.152935 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.158186 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.158256 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fkgd\" (UniqueName: \"kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.158299 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.160051 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.160050 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.160226 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt"] Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.259980 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.260418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.260572 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fkgd\" (UniqueName: \"kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.261079 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume\") pod 
\"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.268273 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.278405 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fkgd\" (UniqueName: \"kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd\") pod \"collect-profiles-29566635-cd2rt\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.477184 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:00 crc kubenswrapper[4958]: I0320 09:15:00.676614 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt"] Mar 20 09:15:01 crc kubenswrapper[4958]: I0320 09:15:01.223167 4958 generic.go:334] "Generic (PLEG): container finished" podID="1138fc0d-0fcb-449f-89ba-b92e2dc54c94" containerID="f07afb39d5b3687f76b76f0a736c1d7764a631d17c93fc963391bb66aabf25ea" exitCode=0 Mar 20 09:15:01 crc kubenswrapper[4958]: I0320 09:15:01.223228 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" event={"ID":"1138fc0d-0fcb-449f-89ba-b92e2dc54c94","Type":"ContainerDied","Data":"f07afb39d5b3687f76b76f0a736c1d7764a631d17c93fc963391bb66aabf25ea"} Mar 20 09:15:01 crc kubenswrapper[4958]: I0320 09:15:01.223265 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" event={"ID":"1138fc0d-0fcb-449f-89ba-b92e2dc54c94","Type":"ContainerStarted","Data":"ae741ccea655ac8b0daa5d146cb52cb19506369e7a59b198744d2c454cb806e5"} Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.468111 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.502278 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume\") pod \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.502390 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume\") pod \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.502534 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fkgd\" (UniqueName: \"kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd\") pod \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\" (UID: \"1138fc0d-0fcb-449f-89ba-b92e2dc54c94\") " Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.504758 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume" (OuterVolumeSpecName: "config-volume") pod "1138fc0d-0fcb-449f-89ba-b92e2dc54c94" (UID: "1138fc0d-0fcb-449f-89ba-b92e2dc54c94"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.511623 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1138fc0d-0fcb-449f-89ba-b92e2dc54c94" (UID: "1138fc0d-0fcb-449f-89ba-b92e2dc54c94"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.512799 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd" (OuterVolumeSpecName: "kube-api-access-7fkgd") pod "1138fc0d-0fcb-449f-89ba-b92e2dc54c94" (UID: "1138fc0d-0fcb-449f-89ba-b92e2dc54c94"). InnerVolumeSpecName "kube-api-access-7fkgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.604366 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.604409 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:02 crc kubenswrapper[4958]: I0320 09:15:02.604418 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7fkgd\" (UniqueName: \"kubernetes.io/projected/1138fc0d-0fcb-449f-89ba-b92e2dc54c94-kube-api-access-7fkgd\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:03 crc kubenswrapper[4958]: I0320 09:15:03.240995 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" event={"ID":"1138fc0d-0fcb-449f-89ba-b92e2dc54c94","Type":"ContainerDied","Data":"ae741ccea655ac8b0daa5d146cb52cb19506369e7a59b198744d2c454cb806e5"} Mar 20 09:15:03 crc kubenswrapper[4958]: I0320 09:15:03.241520 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ae741ccea655ac8b0daa5d146cb52cb19506369e7a59b198744d2c454cb806e5" Mar 20 09:15:03 crc kubenswrapper[4958]: I0320 09:15:03.241122 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566635-cd2rt" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.615574 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b"] Mar 20 09:15:10 crc kubenswrapper[4958]: E0320 09:15:10.616701 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1138fc0d-0fcb-449f-89ba-b92e2dc54c94" containerName="collect-profiles" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.616719 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="1138fc0d-0fcb-449f-89ba-b92e2dc54c94" containerName="collect-profiles" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.616866 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="1138fc0d-0fcb-449f-89ba-b92e2dc54c94" containerName="collect-profiles" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.617741 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.620254 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.636397 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b"] Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.722431 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.722575 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blnxx\" (UniqueName: \"kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.722646 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.823573 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.823707 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.823753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blnxx\" (UniqueName: \"kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.824442 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.824499 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.843120 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blnxx\" (UniqueName: \"kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:10 crc kubenswrapper[4958]: I0320 09:15:10.947323 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.184933 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b"] Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.222710 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-hrxfl" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerName="console" containerID="cri-o://01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce" gracePeriod=15 Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.318232 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerStarted","Data":"a18677e4961e5d972d63007a1862a1c93a7070a2e0b23fe8f11934ae89dfacd3"} Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.587975 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hrxfl_460baf6e-b4fd-4f68-804b-86d4767241d1/console/0.log" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.588519 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.741192 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6gwdx\" (UniqueName: \"kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.741858 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.741986 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.742015 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.742048 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.742069 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.742125 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle\") pod \"460baf6e-b4fd-4f68-804b-86d4767241d1\" (UID: \"460baf6e-b4fd-4f68-804b-86d4767241d1\") " Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.743196 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.743217 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config" (OuterVolumeSpecName: "console-config") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.743294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.743778 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca" (OuterVolumeSpecName: "service-ca") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.749036 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.749156 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx" (OuterVolumeSpecName: "kube-api-access-6gwdx") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "kube-api-access-6gwdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.749304 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "460baf6e-b4fd-4f68-804b-86d4767241d1" (UID: "460baf6e-b4fd-4f68-804b-86d4767241d1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.842954 4958 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843015 4958 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843025 4958 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843036 4958 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843044 4958 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/460baf6e-b4fd-4f68-804b-86d4767241d1-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843053 4958 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/460baf6e-b4fd-4f68-804b-86d4767241d1-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:11 crc kubenswrapper[4958]: I0320 09:15:11.843064 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6gwdx\" (UniqueName: \"kubernetes.io/projected/460baf6e-b4fd-4f68-804b-86d4767241d1-kube-api-access-6gwdx\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.325620 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerID="501255f82726c3c97e1198d90d392b569b88ad25372f3e05c6d66c4c1ad1ced0" exitCode=0 Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.325672 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerDied","Data":"501255f82726c3c97e1198d90d392b569b88ad25372f3e05c6d66c4c1ad1ced0"} Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327468 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-hrxfl_460baf6e-b4fd-4f68-804b-86d4767241d1/console/0.log" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327501 4958 generic.go:334] "Generic (PLEG): container finished" podID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerID="01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce" exitCode=2 Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327525 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hrxfl" event={"ID":"460baf6e-b4fd-4f68-804b-86d4767241d1","Type":"ContainerDied","Data":"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce"} Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327547 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-hrxfl" 
event={"ID":"460baf6e-b4fd-4f68-804b-86d4767241d1","Type":"ContainerDied","Data":"f00415f5e6083c444597746260f452c5d13d3b01e4c601e45a8f5d505dbf5164"} Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327569 4958 scope.go:117] "RemoveContainer" containerID="01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.327643 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-hrxfl" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.367751 4958 scope.go:117] "RemoveContainer" containerID="01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce" Mar 20 09:15:12 crc kubenswrapper[4958]: E0320 09:15:12.369467 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce\": container with ID starting with 01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce not found: ID does not exist" containerID="01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.369636 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce"} err="failed to get container status \"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce\": rpc error: code = NotFound desc = could not find container \"01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce\": container with ID starting with 01f773b46c22e4842434ea89cdef4803d16043ccc1877f03067fdf31b1da9bce not found: ID does not exist" Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.376446 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.380834 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-hrxfl"] Mar 20 09:15:12 crc kubenswrapper[4958]: I0320 09:15:12.442126 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" path="/var/lib/kubelet/pods/460baf6e-b4fd-4f68-804b-86d4767241d1/volumes" Mar 20 09:15:14 crc kubenswrapper[4958]: I0320 09:15:14.348103 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerStarted","Data":"9cd09ffa86fc5d896de525648deaf12e72a43f87ca72d8b798e5172e7c23728a"} Mar 20 09:15:15 crc kubenswrapper[4958]: I0320 09:15:15.360302 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerID="9cd09ffa86fc5d896de525648deaf12e72a43f87ca72d8b798e5172e7c23728a" exitCode=0 Mar 20 09:15:15 crc kubenswrapper[4958]: I0320 09:15:15.360448 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerDied","Data":"9cd09ffa86fc5d896de525648deaf12e72a43f87ca72d8b798e5172e7c23728a"} Mar 20 09:15:16 crc kubenswrapper[4958]: I0320 09:15:16.371214 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" 
containerID="02692819a8932633de5d6f6f59267f940bf1eb19a931217b0891c3fcdf79a737" exitCode=0 Mar 20 09:15:16 crc kubenswrapper[4958]: I0320 09:15:16.371288 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerDied","Data":"02692819a8932633de5d6f6f59267f940bf1eb19a931217b0891c3fcdf79a737"} Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.683101 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.847340 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blnxx\" (UniqueName: \"kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx\") pod \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.847404 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle\") pod \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.848063 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util\") pod \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\" (UID: \"2f5ce30c-74f6-431c-9df1-32530fdc4ade\") " Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.848575 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle" (OuterVolumeSpecName: "bundle") pod "2f5ce30c-74f6-431c-9df1-32530fdc4ade" (UID: "2f5ce30c-74f6-431c-9df1-32530fdc4ade"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.853483 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx" (OuterVolumeSpecName: "kube-api-access-blnxx") pod "2f5ce30c-74f6-431c-9df1-32530fdc4ade" (UID: "2f5ce30c-74f6-431c-9df1-32530fdc4ade"). InnerVolumeSpecName "kube-api-access-blnxx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.865238 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util" (OuterVolumeSpecName: "util") pod "2f5ce30c-74f6-431c-9df1-32530fdc4ade" (UID: "2f5ce30c-74f6-431c-9df1-32530fdc4ade"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.949126 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-util\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.949173 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blnxx\" (UniqueName: \"kubernetes.io/projected/2f5ce30c-74f6-431c-9df1-32530fdc4ade-kube-api-access-blnxx\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:17 crc kubenswrapper[4958]: I0320 09:15:17.949188 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2f5ce30c-74f6-431c-9df1-32530fdc4ade-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:15:18 crc kubenswrapper[4958]: I0320 09:15:18.390016 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" event={"ID":"2f5ce30c-74f6-431c-9df1-32530fdc4ade","Type":"ContainerDied","Data":"a18677e4961e5d972d63007a1862a1c93a7070a2e0b23fe8f11934ae89dfacd3"} Mar 20 09:15:18 crc kubenswrapper[4958]: I0320 09:15:18.390478 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a18677e4961e5d972d63007a1862a1c93a7070a2e0b23fe8f11934ae89dfacd3" Mar 20 09:15:18 crc kubenswrapper[4958]: I0320 09:15:18.390057 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b" Mar 20 09:15:26 crc kubenswrapper[4958]: I0320 09:15:26.521747 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:15:26 crc kubenswrapper[4958]: I0320 09:15:26.522646 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.677373 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf"] Mar 20 09:15:29 crc kubenswrapper[4958]: E0320 09:15:29.678301 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="pull" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.678319 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="pull" Mar 20 09:15:29 crc kubenswrapper[4958]: E0320 09:15:29.678335 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="extract" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.678344 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="extract" Mar 20 09:15:29 crc kubenswrapper[4958]: E0320 09:15:29.678371 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="util" Mar 20 09:15:29 
crc kubenswrapper[4958]: I0320 09:15:29.678379 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="util" Mar 20 09:15:29 crc kubenswrapper[4958]: E0320 09:15:29.678393 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerName="console" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.678400 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerName="console" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.678581 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f5ce30c-74f6-431c-9df1-32530fdc4ade" containerName="extract" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.678615 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="460baf6e-b4fd-4f68-804b-86d4767241d1" containerName="console" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.679151 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.681879 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.682151 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-zhr2x" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.682738 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.683581 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.683745 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.703501 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf"] Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.813910 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-apiservice-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.814061 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4gf7\" (UniqueName: \"kubernetes.io/projected/c1c0a68d-5950-4e09-a7e9-918863cf2008-kube-api-access-v4gf7\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.814147 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-webhook-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: 
\"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.915187 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-apiservice-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.916130 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4gf7\" (UniqueName: \"kubernetes.io/projected/c1c0a68d-5950-4e09-a7e9-918863cf2008-kube-api-access-v4gf7\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.916190 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-webhook-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.926382 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-apiservice-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.934216 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c1c0a68d-5950-4e09-a7e9-918863cf2008-webhook-cert\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.948412 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6"] Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.949548 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.955935 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.956139 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.956175 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-57d5l" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.956986 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4gf7\" (UniqueName: \"kubernetes.io/projected/c1c0a68d-5950-4e09-a7e9-918863cf2008-kube-api-access-v4gf7\") pod \"metallb-operator-controller-manager-65b48c4558-h8dcf\" (UID: \"c1c0a68d-5950-4e09-a7e9-918863cf2008\") " pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.965388 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6"] Mar 20 09:15:29 crc kubenswrapper[4958]: I0320 09:15:29.995645 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.017040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-webhook-cert\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.017096 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-apiservice-cert\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.017142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rpggf\" (UniqueName: \"kubernetes.io/projected/fdf4b931-9e36-44d0-b69b-7156d89875d9-kube-api-access-rpggf\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.118185 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-webhook-cert\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.118574 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-apiservice-cert\") pod 
\"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.118659 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rpggf\" (UniqueName: \"kubernetes.io/projected/fdf4b931-9e36-44d0-b69b-7156d89875d9-kube-api-access-rpggf\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.127831 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-apiservice-cert\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.131999 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fdf4b931-9e36-44d0-b69b-7156d89875d9-webhook-cert\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.143689 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rpggf\" (UniqueName: \"kubernetes.io/projected/fdf4b931-9e36-44d0-b69b-7156d89875d9-kube-api-access-rpggf\") pod \"metallb-operator-webhook-server-79b7b75cdf-mmtj6\" (UID: \"fdf4b931-9e36-44d0-b69b-7156d89875d9\") " pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.260343 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf"] Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.303343 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.484420 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" event={"ID":"c1c0a68d-5950-4e09-a7e9-918863cf2008","Type":"ContainerStarted","Data":"5214a5732a806cb43827ef03659b92abe4416daa483bec82213035fd3d46f39e"} Mar 20 09:15:30 crc kubenswrapper[4958]: I0320 09:15:30.546815 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6"] Mar 20 09:15:30 crc kubenswrapper[4958]: W0320 09:15:30.563902 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfdf4b931_9e36_44d0_b69b_7156d89875d9.slice/crio-1ced4744f7a3c7825205c0446242cc90674e70889e34d150e5ff8018dba5259f WatchSource:0}: Error finding container 1ced4744f7a3c7825205c0446242cc90674e70889e34d150e5ff8018dba5259f: Status 404 returned error can't find the container with id 1ced4744f7a3c7825205c0446242cc90674e70889e34d150e5ff8018dba5259f Mar 20 09:15:31 crc kubenswrapper[4958]: I0320 09:15:31.491092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" event={"ID":"fdf4b931-9e36-44d0-b69b-7156d89875d9","Type":"ContainerStarted","Data":"1ced4744f7a3c7825205c0446242cc90674e70889e34d150e5ff8018dba5259f"} Mar 20 09:15:35 crc kubenswrapper[4958]: I0320 09:15:35.521791 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" event={"ID":"c1c0a68d-5950-4e09-a7e9-918863cf2008","Type":"ContainerStarted","Data":"09c00db5bd8be33b41b4956e01694659cbe5a507a84cc881c9b444f17747cf10"} Mar 20 09:15:35 crc kubenswrapper[4958]: I0320 09:15:35.524643 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:15:35 crc kubenswrapper[4958]: I0320 09:15:35.555304 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" podStartSLOduration=1.931228907 podStartE2EDuration="6.555278578s" podCreationTimestamp="2026-03-20 09:15:29 +0000 UTC" firstStartedPulling="2026-03-20 09:15:30.271338426 +0000 UTC m=+950.593354374" lastFinishedPulling="2026-03-20 09:15:34.895388087 +0000 UTC m=+955.217404045" observedRunningTime="2026-03-20 09:15:35.543129652 +0000 UTC m=+955.865145630" watchObservedRunningTime="2026-03-20 09:15:35.555278578 +0000 UTC m=+955.877294536" Mar 20 09:15:37 crc kubenswrapper[4958]: I0320 09:15:37.540711 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" event={"ID":"fdf4b931-9e36-44d0-b69b-7156d89875d9","Type":"ContainerStarted","Data":"98cc16a0ec557fdc13aaaa48b01fb635d4603dc6979e2390f4aaa26a2d6f3dba"} Mar 20 09:15:37 crc kubenswrapper[4958]: I0320 09:15:37.541083 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:37 crc kubenswrapper[4958]: I0320 09:15:37.564763 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" podStartSLOduration=2.268902057 podStartE2EDuration="8.564741615s" 
podCreationTimestamp="2026-03-20 09:15:29 +0000 UTC" firstStartedPulling="2026-03-20 09:15:30.567852008 +0000 UTC m=+950.889867966" lastFinishedPulling="2026-03-20 09:15:36.863691566 +0000 UTC m=+957.185707524" observedRunningTime="2026-03-20 09:15:37.563574553 +0000 UTC m=+957.885590511" watchObservedRunningTime="2026-03-20 09:15:37.564741615 +0000 UTC m=+957.886757573" Mar 20 09:15:50 crc kubenswrapper[4958]: I0320 09:15:50.307933 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-79b7b75cdf-mmtj6" Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.757492 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.758855 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.768629 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.907746 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.907838 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrzsc\" (UniqueName: \"kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:51 crc kubenswrapper[4958]: I0320 09:15:51.907912 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.009622 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.009724 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.009752 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrzsc\" (UniqueName: \"kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " 
pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.010590 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.010867 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.033697 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrzsc\" (UniqueName: \"kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc\") pod \"certified-operators-jbgr7\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.077715 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.576148 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:15:52 crc kubenswrapper[4958]: W0320 09:15:52.592408 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee649fd_9006_4b86_9ef0_98eb482a70c4.slice/crio-e5d7b6325b2e31c385f57044b5193aa43b34b1cfab80487cf3e94d50ffe67f6b WatchSource:0}: Error finding container e5d7b6325b2e31c385f57044b5193aa43b34b1cfab80487cf3e94d50ffe67f6b: Status 404 returned error can't find the container with id e5d7b6325b2e31c385f57044b5193aa43b34b1cfab80487cf3e94d50ffe67f6b Mar 20 09:15:52 crc kubenswrapper[4958]: I0320 09:15:52.625840 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerStarted","Data":"e5d7b6325b2e31c385f57044b5193aa43b34b1cfab80487cf3e94d50ffe67f6b"} Mar 20 09:15:53 crc kubenswrapper[4958]: I0320 09:15:53.632787 4958 generic.go:334] "Generic (PLEG): container finished" podID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerID="02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3" exitCode=0 Mar 20 09:15:53 crc kubenswrapper[4958]: I0320 09:15:53.633132 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerDied","Data":"02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3"} Mar 20 09:15:54 crc kubenswrapper[4958]: I0320 09:15:54.640962 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerStarted","Data":"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602"} Mar 20 09:15:55 crc kubenswrapper[4958]: I0320 09:15:55.651084 4958 generic.go:334] "Generic (PLEG): container finished" podID="aee649fd-9006-4b86-9ef0-98eb482a70c4" 
containerID="31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602" exitCode=0 Mar 20 09:15:55 crc kubenswrapper[4958]: I0320 09:15:55.651274 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerDied","Data":"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602"} Mar 20 09:15:56 crc kubenswrapper[4958]: I0320 09:15:56.521346 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:15:56 crc kubenswrapper[4958]: I0320 09:15:56.521858 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:15:56 crc kubenswrapper[4958]: I0320 09:15:56.662176 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerStarted","Data":"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62"} Mar 20 09:15:56 crc kubenswrapper[4958]: I0320 09:15:56.693832 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-jbgr7" podStartSLOduration=3.069569897 podStartE2EDuration="5.69380614s" podCreationTimestamp="2026-03-20 09:15:51 +0000 UTC" firstStartedPulling="2026-03-20 09:15:53.634318223 +0000 UTC m=+973.956334201" lastFinishedPulling="2026-03-20 09:15:56.258554496 +0000 UTC m=+976.580570444" observedRunningTime="2026-03-20 09:15:56.685378167 +0000 UTC m=+977.007394125" watchObservedRunningTime="2026-03-20 09:15:56.69380614 +0000 UTC m=+977.015822098" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.136746 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566636-54tzd"] Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.137652 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.141093 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.141098 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.141926 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.152167 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-54tzd"] Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.211142 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vx5sl\" (UniqueName: \"kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl\") pod \"auto-csr-approver-29566636-54tzd\" (UID: \"add4ecff-63cc-486a-90ed-3e61f3c143ba\") " pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.312158 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vx5sl\" (UniqueName: \"kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl\") pod \"auto-csr-approver-29566636-54tzd\" (UID: \"add4ecff-63cc-486a-90ed-3e61f3c143ba\") " pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.332295 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vx5sl\" (UniqueName: \"kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl\") pod \"auto-csr-approver-29566636-54tzd\" (UID: \"add4ecff-63cc-486a-90ed-3e61f3c143ba\") " pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.454682 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:00 crc kubenswrapper[4958]: I0320 09:16:00.921760 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-54tzd"] Mar 20 09:16:01 crc kubenswrapper[4958]: I0320 09:16:01.709408 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-54tzd" event={"ID":"add4ecff-63cc-486a-90ed-3e61f3c143ba","Type":"ContainerStarted","Data":"fd1426ae85991f49e38def6fd66ca71589772d1101f8bb108abdfe8775de60cc"} Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.078963 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.079656 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.243150 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.716885 4958 generic.go:334] "Generic (PLEG): container finished" podID="add4ecff-63cc-486a-90ed-3e61f3c143ba" containerID="c263df7d94f23aa7486f8436bc1f644a5d1243f92e23f1d903429f26741ef1d6" exitCode=0 Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.716990 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-54tzd" event={"ID":"add4ecff-63cc-486a-90ed-3e61f3c143ba","Type":"ContainerDied","Data":"c263df7d94f23aa7486f8436bc1f644a5d1243f92e23f1d903429f26741ef1d6"} Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.761110 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:02 crc kubenswrapper[4958]: I0320 09:16:02.802807 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:16:03 crc kubenswrapper[4958]: I0320 09:16:03.944877 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.101540 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vx5sl\" (UniqueName: \"kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl\") pod \"add4ecff-63cc-486a-90ed-3e61f3c143ba\" (UID: \"add4ecff-63cc-486a-90ed-3e61f3c143ba\") " Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.110344 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl" (OuterVolumeSpecName: "kube-api-access-vx5sl") pod "add4ecff-63cc-486a-90ed-3e61f3c143ba" (UID: "add4ecff-63cc-486a-90ed-3e61f3c143ba"). InnerVolumeSpecName "kube-api-access-vx5sl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.205901 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vx5sl\" (UniqueName: \"kubernetes.io/projected/add4ecff-63cc-486a-90ed-3e61f3c143ba-kube-api-access-vx5sl\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.728417 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566636-54tzd" event={"ID":"add4ecff-63cc-486a-90ed-3e61f3c143ba","Type":"ContainerDied","Data":"fd1426ae85991f49e38def6fd66ca71589772d1101f8bb108abdfe8775de60cc"} Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.728499 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd1426ae85991f49e38def6fd66ca71589772d1101f8bb108abdfe8775de60cc" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.728435 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566636-54tzd" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.728958 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-jbgr7" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="registry-server" containerID="cri-o://5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62" gracePeriod=2 Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.891576 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xl58v"] Mar 20 09:16:04 crc kubenswrapper[4958]: E0320 09:16:04.894655 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="add4ecff-63cc-486a-90ed-3e61f3c143ba" containerName="oc" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.894724 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="add4ecff-63cc-486a-90ed-3e61f3c143ba" containerName="oc" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.894959 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="add4ecff-63cc-486a-90ed-3e61f3c143ba" containerName="oc" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.897043 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.903317 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xl58v"] Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.921565 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.921804 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-75sk6\" (UniqueName: \"kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:04 crc kubenswrapper[4958]: I0320 09:16:04.921838 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.017811 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-bz4vf"] Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.022682 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.022772 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-75sk6\" (UniqueName: \"kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.022802 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.023472 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.023460 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566630-bz4vf"] Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.023938 4958 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.060342 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-75sk6\" (UniqueName: \"kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6\") pod \"community-operators-xl58v\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") " pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.141473 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.212480 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xl58v" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.325039 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities\") pod \"aee649fd-9006-4b86-9ef0-98eb482a70c4\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.325396 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content\") pod \"aee649fd-9006-4b86-9ef0-98eb482a70c4\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.325458 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrzsc\" (UniqueName: \"kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc\") pod \"aee649fd-9006-4b86-9ef0-98eb482a70c4\" (UID: \"aee649fd-9006-4b86-9ef0-98eb482a70c4\") " Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.328034 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities" (OuterVolumeSpecName: "utilities") pod "aee649fd-9006-4b86-9ef0-98eb482a70c4" (UID: "aee649fd-9006-4b86-9ef0-98eb482a70c4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.333553 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc" (OuterVolumeSpecName: "kube-api-access-qrzsc") pod "aee649fd-9006-4b86-9ef0-98eb482a70c4" (UID: "aee649fd-9006-4b86-9ef0-98eb482a70c4"). InnerVolumeSpecName "kube-api-access-qrzsc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.427637 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.427702 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrzsc\" (UniqueName: \"kubernetes.io/projected/aee649fd-9006-4b86-9ef0-98eb482a70c4-kube-api-access-qrzsc\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.497871 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xl58v"] Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.662592 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "aee649fd-9006-4b86-9ef0-98eb482a70c4" (UID: "aee649fd-9006-4b86-9ef0-98eb482a70c4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.731579 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/aee649fd-9006-4b86-9ef0-98eb482a70c4-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.737417 4958 generic.go:334] "Generic (PLEG): container finished" podID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerID="37093a1fd30e9c59aaef224a98275a472ca68915b7e67fb422b37267ded4d3ca" exitCode=0 Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.737503 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerDied","Data":"37093a1fd30e9c59aaef224a98275a472ca68915b7e67fb422b37267ded4d3ca"} Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.737555 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerStarted","Data":"5f9f730cd5cfe5505bb788ddcd1b442f53608a7101e37fdecb19fd306a7d56df"} Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.742970 4958 generic.go:334] "Generic (PLEG): container finished" podID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerID="5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62" exitCode=0 Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.743058 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-jbgr7" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.745855 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerDied","Data":"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62"} Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.751857 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-jbgr7" event={"ID":"aee649fd-9006-4b86-9ef0-98eb482a70c4","Type":"ContainerDied","Data":"e5d7b6325b2e31c385f57044b5193aa43b34b1cfab80487cf3e94d50ffe67f6b"} Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.751889 4958 scope.go:117] "RemoveContainer" containerID="5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.785054 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.785236 4958 scope.go:117] "RemoveContainer" containerID="31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.789289 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-jbgr7"] Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.810692 4958 scope.go:117] "RemoveContainer" containerID="02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.833676 4958 scope.go:117] "RemoveContainer" containerID="5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62" Mar 20 09:16:05 crc kubenswrapper[4958]: E0320 09:16:05.834121 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62\": container with ID starting with 5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62 not found: ID does not exist" containerID="5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.834163 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62"} err="failed to get container status \"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62\": rpc error: code = NotFound desc = could not find container \"5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62\": container with ID starting with 5f9e69242485e2b902e38c27e1475aa41970a1195a63ca93d3a57a0355a40a62 not found: ID does not exist" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.834196 4958 scope.go:117] "RemoveContainer" containerID="31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602" Mar 20 09:16:05 crc kubenswrapper[4958]: E0320 09:16:05.834590 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602\": container with ID starting with 31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602 not found: ID does not exist" containerID="31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.834635 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602"} err="failed to get container status \"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602\": rpc error: code = NotFound desc = could not find container \"31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602\": container with ID starting with 31254863e86b4e0af6e16bfe8426c21c109dbf5383260f48733abd6d93b47602 not found: ID does not exist" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.834653 4958 scope.go:117] "RemoveContainer" containerID="02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3" Mar 20 09:16:05 crc kubenswrapper[4958]: E0320 09:16:05.835323 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3\": container with ID starting with 02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3 not found: ID does not exist" containerID="02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3" Mar 20 09:16:05 crc kubenswrapper[4958]: I0320 09:16:05.835382 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3"} err="failed to get container status \"02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3\": rpc error: code = NotFound desc = could not find container \"02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3\": container with ID starting with 02199a2ad5bfaeefd34373a40982b54a07fd72a5fcee30514c128ec86541a1b3 not found: ID does not exist" Mar 20 09:16:06 crc kubenswrapper[4958]: I0320 09:16:06.455056 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5413de9b-2a29-40e8-ace1-8bcd650af14a" path="/var/lib/kubelet/pods/5413de9b-2a29-40e8-ace1-8bcd650af14a/volumes" Mar 20 09:16:06 crc kubenswrapper[4958]: I0320 09:16:06.456319 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" path="/var/lib/kubelet/pods/aee649fd-9006-4b86-9ef0-98eb482a70c4/volumes" Mar 20 09:16:06 crc kubenswrapper[4958]: I0320 09:16:06.751026 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerStarted","Data":"9270600c9a613ca13327bb0f9dd18e2eb7add95e7406ab4eda3d5ccd36ead98a"} Mar 20 09:16:07 crc kubenswrapper[4958]: I0320 09:16:07.760101 4958 generic.go:334] "Generic (PLEG): container finished" podID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerID="9270600c9a613ca13327bb0f9dd18e2eb7add95e7406ab4eda3d5ccd36ead98a" exitCode=0 Mar 20 09:16:07 crc kubenswrapper[4958]: I0320 09:16:07.760162 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerDied","Data":"9270600c9a613ca13327bb0f9dd18e2eb7add95e7406ab4eda3d5ccd36ead98a"} Mar 20 09:16:08 crc kubenswrapper[4958]: I0320 09:16:08.769299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerStarted","Data":"0f3bca0c0f31cfde2bfeb4640e0eec22c08525d82b98f9facdd4f7fa71e6b11a"} Mar 20 09:16:08 crc kubenswrapper[4958]: I0320 09:16:08.788934 4958 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xl58v" podStartSLOduration=2.118543945 podStartE2EDuration="4.78891354s" podCreationTimestamp="2026-03-20 09:16:04 +0000 UTC" firstStartedPulling="2026-03-20 09:16:05.740403548 +0000 UTC m=+986.062419506" lastFinishedPulling="2026-03-20 09:16:08.410773143 +0000 UTC m=+988.732789101" observedRunningTime="2026-03-20 09:16:08.788111898 +0000 UTC m=+989.110127846" watchObservedRunningTime="2026-03-20 09:16:08.78891354 +0000 UTC m=+989.110929498" Mar 20 09:16:09 crc kubenswrapper[4958]: I0320 09:16:09.998762 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-65b48c4558-h8dcf" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.693901 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-jsg5p"] Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.694652 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="extract-content" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.694673 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="extract-content" Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.694695 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="registry-server" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.694701 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="registry-server" Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.694713 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="extract-utilities" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.694720 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="extract-utilities" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.694868 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="aee649fd-9006-4b86-9ef0-98eb482a70c4" containerName="registry-server" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.697227 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.700678 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.702292 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.702567 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-4kpvh" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.704696 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.704861 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics-certs\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.705043 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpqch\" (UniqueName: \"kubernetes.io/projected/3669e607-3d8e-4e9e-8468-26d0032e0590-kube-api-access-bpqch\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.705135 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-startup\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.705203 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-reloader\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.705257 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-sockets\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.705291 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-conf\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.713327 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"] Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.714925 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.718412 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.728069 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"] Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.801059 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-zt86p"] Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.802154 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.804682 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-kvbqx" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.805122 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.805653 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806037 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-startup\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806083 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4ph7\" (UniqueName: \"kubernetes.io/projected/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-kube-api-access-c4ph7\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806115 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-reloader\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806146 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-sockets\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806167 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806187 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metallb-excludel2\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 
09:16:10.806207 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-conf\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806230 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806257 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjdd7\" (UniqueName: \"kubernetes.io/projected/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-kube-api-access-zjdd7\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806292 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metrics-certs\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806328 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806353 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics-certs\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.806386 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpqch\" (UniqueName: \"kubernetes.io/projected/3669e607-3d8e-4e9e-8468-26d0032e0590-kube-api-access-bpqch\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.807242 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.807406 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-conf\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.807685 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: 
\"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-sockets\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.807780 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/3669e607-3d8e-4e9e-8468-26d0032e0590-reloader\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.807874 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/3669e607-3d8e-4e9e-8468-26d0032e0590-frr-startup\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.808445 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.815740 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/3669e607-3d8e-4e9e-8468-26d0032e0590-metrics-certs\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.826122 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-hwwvd"] Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.827207 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.830901 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.839832 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpqch\" (UniqueName: \"kubernetes.io/projected/3669e607-3d8e-4e9e-8468-26d0032e0590-kube-api-access-bpqch\") pod \"frr-k8s-jsg5p\" (UID: \"3669e607-3d8e-4e9e-8468-26d0032e0590\") " pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.862320 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hwwvd"] Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907609 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metallb-excludel2\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907706 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkltj\" (UniqueName: \"kubernetes.io/projected/d29fc852-1061-4f79-a204-3dc6a4f73e6c-kube-api-access-zkltj\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: 
\"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907734 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-metrics-certs\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907760 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907784 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjdd7\" (UniqueName: \"kubernetes.io/projected/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-kube-api-access-zjdd7\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907805 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metrics-certs\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907853 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-cert\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.907901 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4ph7\" (UniqueName: \"kubernetes.io/projected/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-kube-api-access-c4ph7\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.908420 4958 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.908486 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist podName:83a41007-6a0b-499e-b7e0-5dbaabb47a9c nodeName:}" failed. No retries permitted until 2026-03-20 09:16:11.408465709 +0000 UTC m=+991.730481667 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist") pod "speaker-zt86p" (UID: "83a41007-6a0b-499e-b7e0-5dbaabb47a9c") : secret "metallb-memberlist" not found Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.909380 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metallb-excludel2\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.909470 4958 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 09:16:10 crc kubenswrapper[4958]: E0320 09:16:10.909499 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert podName:82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb nodeName:}" failed. No retries permitted until 2026-03-20 09:16:11.409487117 +0000 UTC m=+991.731503075 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert") pod "frr-k8s-webhook-server-bcc4b6f68-wbqjj" (UID: "82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb") : secret "frr-k8s-webhook-server-cert" not found Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.925583 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-metrics-certs\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.931469 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjdd7\" (UniqueName: \"kubernetes.io/projected/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-kube-api-access-zjdd7\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:10 crc kubenswrapper[4958]: I0320 09:16:10.932507 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4ph7\" (UniqueName: \"kubernetes.io/projected/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-kube-api-access-c4ph7\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.009082 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-metrics-certs\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.009210 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-cert\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.009284 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkltj\" (UniqueName: 
\"kubernetes.io/projected/d29fc852-1061-4f79-a204-3dc6a4f73e6c-kube-api-access-zkltj\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.013392 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-metrics-certs\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.014975 4958 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.016972 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-jsg5p" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.024015 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/d29fc852-1061-4f79-a204-3dc6a4f73e6c-cert\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.031224 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkltj\" (UniqueName: \"kubernetes.io/projected/d29fc852-1061-4f79-a204-3dc6a4f73e6c-kube-api-access-zkltj\") pod \"controller-7bb4cc7c98-hwwvd\" (UID: \"d29fc852-1061-4f79-a204-3dc6a4f73e6c\") " pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.186003 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-hwwvd" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.416763 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p" Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.416848 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" Mar 20 09:16:11 crc kubenswrapper[4958]: E0320 09:16:11.417029 4958 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 09:16:11 crc kubenswrapper[4958]: E0320 09:16:11.417504 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist podName:83a41007-6a0b-499e-b7e0-5dbaabb47a9c nodeName:}" failed. No retries permitted until 2026-03-20 09:16:12.417471181 +0000 UTC m=+992.739487139 (durationBeforeRetry 1s). 
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.423253 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-wbqjj\" (UID: \"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.508074 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-hwwvd"]
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.627769 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.789333 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"25b37b99dec4f28034065262dc5f014c0abeccdf697735c6594379d413ae41e3"}
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.790776 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hwwvd" event={"ID":"d29fc852-1061-4f79-a204-3dc6a4f73e6c","Type":"ContainerStarted","Data":"f4288e8334d527d1c31e17907d81e714fedeb169cc76904b173728b83f7c4054"}
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.790803 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hwwvd" event={"ID":"d29fc852-1061-4f79-a204-3dc6a4f73e6c","Type":"ContainerStarted","Data":"04d6f95c5a910889e209c83017cb7dee1e51e7612539434866f039dad12f820b"}
Mar 20 09:16:11 crc kubenswrapper[4958]: I0320 09:16:11.908840 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"]
Mar 20 09:16:11 crc kubenswrapper[4958]: W0320 09:16:11.915057 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod82a3e089_0afe_4bc8_addb_c3e2ceb6bbfb.slice/crio-4a50bb1018f1a6ebffe5981a74b2c665a1f26b4ec1134e7ad3226f2a4dfce1e8 WatchSource:0}: Error finding container 4a50bb1018f1a6ebffe5981a74b2c665a1f26b4ec1134e7ad3226f2a4dfce1e8: Status 404 returned error can't find the container with id 4a50bb1018f1a6ebffe5981a74b2c665a1f26b4ec1134e7ad3226f2a4dfce1e8
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.438049 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p"
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.445681 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/83a41007-6a0b-499e-b7e0-5dbaabb47a9c-memberlist\") pod \"speaker-zt86p\" (UID: \"83a41007-6a0b-499e-b7e0-5dbaabb47a9c\") " pod="metallb-system/speaker-zt86p"
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.675949 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-zt86p"
Mar 20 09:16:12 crc kubenswrapper[4958]: W0320 09:16:12.704953 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83a41007_6a0b_499e_b7e0_5dbaabb47a9c.slice/crio-d88ef50c1c798fbeb4ee97b0d23ed78d5d86c0d84032577b31b79711449a9aaf WatchSource:0}: Error finding container d88ef50c1c798fbeb4ee97b0d23ed78d5d86c0d84032577b31b79711449a9aaf: Status 404 returned error can't find the container with id d88ef50c1c798fbeb4ee97b0d23ed78d5d86c0d84032577b31b79711449a9aaf
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.799939 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-hwwvd" event={"ID":"d29fc852-1061-4f79-a204-3dc6a4f73e6c","Type":"ContainerStarted","Data":"0c9a2771843450a9ec68b11f1c856d3f1dbcf451c7b3488aad0d065e705be782"}
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.800173 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-hwwvd"
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.804854 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt86p" event={"ID":"83a41007-6a0b-499e-b7e0-5dbaabb47a9c","Type":"ContainerStarted","Data":"d88ef50c1c798fbeb4ee97b0d23ed78d5d86c0d84032577b31b79711449a9aaf"}
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.811940 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" event={"ID":"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb","Type":"ContainerStarted","Data":"4a50bb1018f1a6ebffe5981a74b2c665a1f26b4ec1134e7ad3226f2a4dfce1e8"}
Mar 20 09:16:12 crc kubenswrapper[4958]: I0320 09:16:12.825891 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-hwwvd" podStartSLOduration=2.8258689329999997 podStartE2EDuration="2.825868933s" podCreationTimestamp="2026-03-20 09:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:16:12.82105093 +0000 UTC m=+993.143066908" watchObservedRunningTime="2026-03-20 09:16:12.825868933 +0000 UTC m=+993.147884891"
Mar 20 09:16:13 crc kubenswrapper[4958]: I0320 09:16:13.838910 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt86p" event={"ID":"83a41007-6a0b-499e-b7e0-5dbaabb47a9c","Type":"ContainerStarted","Data":"a5d5f626d5d6d54b1b1c59d6738fdbc40dbd2ccf2a2e7acbe49c2418fcb89dc1"}
Mar 20 09:16:13 crc kubenswrapper[4958]: I0320 09:16:13.839871 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-zt86p"
Mar 20 09:16:13 crc kubenswrapper[4958]: I0320 09:16:13.839888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-zt86p" event={"ID":"83a41007-6a0b-499e-b7e0-5dbaabb47a9c","Type":"ContainerStarted","Data":"549cbeac3a179516ec703612bebc45324b24ebb4214ffd34c39647978713a734"}
Mar 20 09:16:13 crc kubenswrapper[4958]: I0320 09:16:13.876397 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-zt86p" podStartSLOduration=3.876373605 podStartE2EDuration="3.876373605s" podCreationTimestamp="2026-03-20 09:16:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:16:13.865784403 +0000 UTC m=+994.187800361" watchObservedRunningTime="2026-03-20 09:16:13.876373605 +0000 UTC m=+994.198389563"
Mar 20 09:16:15 crc kubenswrapper[4958]: I0320 09:16:15.213791 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xl58v"
Mar 20 09:16:15 crc kubenswrapper[4958]: I0320 09:16:15.213848 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xl58v"
Mar 20 09:16:15 crc kubenswrapper[4958]: I0320 09:16:15.265200 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xl58v"
Mar 20 09:16:15 crc kubenswrapper[4958]: I0320 09:16:15.937089 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xl58v"
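The probe sequence above (startup "unhealthy" then "started", readiness "" then "ready") reflects how a startup probe gates the others: readiness has no result at all until the startup probe passes. A compressed Go sketch of that ordering follows; the HTTP endpoint is an assumption purely for illustration (OLM catalog pods such as this one actually probe a gRPC endpoint), and the loop simply runs until something serves the URL.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

// probeOnce performs a single HTTP check the way kubelet's prober would;
// any transport error or non-2xx/3xx status counts as a failure.
func probeOnce(url string) bool {
	client := &http.Client{Timeout: time.Second}
	resp, err := client.Get(url)
	if err != nil {
		return false
	}
	defer resp.Body.Close()
	return resp.StatusCode >= 200 && resp.StatusCode < 400
}

func main() {
	const url = "http://127.0.0.1:8080/healthz" // hypothetical endpoint

	// Until the startup probe succeeds it reports "unhealthy", and the
	// readiness result stays empty (status=""), exactly as in the log.
	for !probeOnce(url) {
		fmt.Println(`probe="startup" status="unhealthy"`)
		time.Sleep(time.Second)
	}
	fmt.Println(`probe="startup" status="started"`)

	// Only after startup passes does the readiness result matter; a
	// success here flips the pod's Ready condition to status="ready".
	if probeOnce(url) {
		fmt.Println(`probe="readiness" status="ready"`)
	}
}
```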
Mar 20 09:16:16 crc kubenswrapper[4958]: I0320 09:16:16.001827 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xl58v"]
Mar 20 09:16:17 crc kubenswrapper[4958]: I0320 09:16:17.891789 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xl58v" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="registry-server" containerID="cri-o://0f3bca0c0f31cfde2bfeb4640e0eec22c08525d82b98f9facdd4f7fa71e6b11a" gracePeriod=2
Mar 20 09:16:18 crc kubenswrapper[4958]: I0320 09:16:18.903696 4958 generic.go:334] "Generic (PLEG): container finished" podID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerID="0f3bca0c0f31cfde2bfeb4640e0eec22c08525d82b98f9facdd4f7fa71e6b11a" exitCode=0
Mar 20 09:16:18 crc kubenswrapper[4958]: I0320 09:16:18.903765 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerDied","Data":"0f3bca0c0f31cfde2bfeb4640e0eec22c08525d82b98f9facdd4f7fa71e6b11a"}
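The "Killing container with a grace period" entry above uses gracePeriod=2: the runtime delivers SIGTERM, and only if the process is still alive once the grace period expires does it escalate to SIGKILL. Here registry-server shut down promptly (exitCode=0), so the kill never escalated. A sketch of the same escalation against an ordinary process; "sleep 60" stands in for the container's main process, and the CRI call itself is not reproduced.

```go
package main

import (
	"fmt"
	"os/exec"
	"syscall"
	"time"
)

func main() {
	// Stand-in for the container's main process.
	cmd := exec.Command("sleep", "60")
	if err := cmd.Start(); err != nil {
		panic(err)
	}

	done := make(chan error, 1)
	go func() { done <- cmd.Wait() }()

	const gracePeriod = 2 * time.Second // gracePeriod=2 from the log line
	_ = cmd.Process.Signal(syscall.SIGTERM)

	select {
	case err := <-done:
		// The registry-server above took this path: it exited within
		// the grace period, so no SIGKILL was needed.
		fmt.Println("exited within grace period:", err)
	case <-time.After(gracePeriod):
		_ = cmd.Process.Kill() // escalate to SIGKILL after the grace period
		fmt.Println("grace period expired; killed")
		<-done
	}
}
```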
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.206883 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xl58v"
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.368300 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities\") pod \"f94d4da7-b898-454c-b5d8-119bb782d6cb\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") "
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.368439 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-75sk6\" (UniqueName: \"kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6\") pod \"f94d4da7-b898-454c-b5d8-119bb782d6cb\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") "
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.368556 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content\") pod \"f94d4da7-b898-454c-b5d8-119bb782d6cb\" (UID: \"f94d4da7-b898-454c-b5d8-119bb782d6cb\") "
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.370215 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities" (OuterVolumeSpecName: "utilities") pod "f94d4da7-b898-454c-b5d8-119bb782d6cb" (UID: "f94d4da7-b898-454c-b5d8-119bb782d6cb"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.376866 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6" (OuterVolumeSpecName: "kube-api-access-75sk6") pod "f94d4da7-b898-454c-b5d8-119bb782d6cb" (UID: "f94d4da7-b898-454c-b5d8-119bb782d6cb"). InnerVolumeSpecName "kube-api-access-75sk6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.422336 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "f94d4da7-b898-454c-b5d8-119bb782d6cb" (UID: "f94d4da7-b898-454c-b5d8-119bb782d6cb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.471190 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-75sk6\" (UniqueName: \"kubernetes.io/projected/f94d4da7-b898-454c-b5d8-119bb782d6cb-kube-api-access-75sk6\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.471236 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.471246 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/f94d4da7-b898-454c-b5d8-119bb782d6cb-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.956687 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" event={"ID":"82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb","Type":"ContainerStarted","Data":"a814025a33b8de4cbc077cedf0efe034c7d5117505f43cf3e84aa30904fee47d"}
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.956789 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.958910 4958 generic.go:334] "Generic (PLEG): container finished" podID="3669e607-3d8e-4e9e-8468-26d0032e0590" containerID="fd0568756422b52246f28a42c920e40a3c6fe0ca226d671812af1dc7a383e04c" exitCode=0
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.959041 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerDied","Data":"fd0568756422b52246f28a42c920e40a3c6fe0ca226d671812af1dc7a383e04c"}
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.961669 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xl58v" event={"ID":"f94d4da7-b898-454c-b5d8-119bb782d6cb","Type":"ContainerDied","Data":"5f9f730cd5cfe5505bb788ddcd1b442f53608a7101e37fdecb19fd306a7d56df"}
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.961728 4958 scope.go:117] "RemoveContainer" containerID="0f3bca0c0f31cfde2bfeb4640e0eec22c08525d82b98f9facdd4f7fa71e6b11a"
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.961897 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xl58v"
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.979452 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj" podStartSLOduration=3.003693005 podStartE2EDuration="10.979426877s" podCreationTimestamp="2026-03-20 09:16:10 +0000 UTC" firstStartedPulling="2026-03-20 09:16:11.919186153 +0000 UTC m=+992.241202111" lastFinishedPulling="2026-03-20 09:16:19.894920025 +0000 UTC m=+1000.216935983" observedRunningTime="2026-03-20 09:16:20.978272496 +0000 UTC m=+1001.300288474" watchObservedRunningTime="2026-03-20 09:16:20.979426877 +0000 UTC m=+1001.301442845"
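The pod_startup_latency_tracker entry above for frr-k8s-webhook-server-bcc4b6f68-wbqjj records two durations: podStartE2EDuration is simply observedRunningTime minus podCreationTimestamp, while podStartSLOduration additionally excludes the image-pull window (lastFinishedPulling minus firstStartedPulling). Re-deriving both from the logged timestamps reproduces the logged values for this pod (10.979426877s and 3.003693005s); the earlier entries with zeroed pull timestamps show the two durations coinciding when no pull happened. A minimal sketch of that arithmetic:

```go
package main

import (
	"fmt"
	"time"
)

// mustParse parses timestamps in the format the kubelet logs them.
func mustParse(s string) time.Time {
	t, err := time.Parse("2006-01-02 15:04:05.999999999 -0700 MST", s)
	if err != nil {
		panic(err)
	}
	return t
}

func main() {
	// Values copied from the pod_startup_latency_tracker entry above.
	created := mustParse("2026-03-20 09:16:10 +0000 UTC")
	firstStartedPulling := mustParse("2026-03-20 09:16:11.919186153 +0000 UTC")
	lastFinishedPulling := mustParse("2026-03-20 09:16:19.894920025 +0000 UTC")
	observedRunning := mustParse("2026-03-20 09:16:20.979426877 +0000 UTC")

	e2e := observedRunning.Sub(created)                       // 10.979426877s
	slo := e2e - lastFinishedPulling.Sub(firstStartedPulling) // 3.003693005s
	fmt.Printf("podStartE2EDuration=%v podStartSLOduration=%v\n", e2e, slo)
}
```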
Mar 20 09:16:20 crc kubenswrapper[4958]: I0320 09:16:20.993551 4958 scope.go:117] "RemoveContainer" containerID="9270600c9a613ca13327bb0f9dd18e2eb7add95e7406ab4eda3d5ccd36ead98a"
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.002697 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xl58v"]
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.012008 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xl58v"]
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.017231 4958 scope.go:117] "RemoveContainer" containerID="37093a1fd30e9c59aaef224a98275a472ca68915b7e67fb422b37267ded4d3ca"
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.192717 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-hwwvd"
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.974469 4958 generic.go:334] "Generic (PLEG): container finished" podID="3669e607-3d8e-4e9e-8468-26d0032e0590" containerID="afb09b7adb7bc8896c135638018127643dcca275f3c4fd4b7667f10899206789" exitCode=0
Mar 20 09:16:21 crc kubenswrapper[4958]: I0320 09:16:21.974578 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerDied","Data":"afb09b7adb7bc8896c135638018127643dcca275f3c4fd4b7667f10899206789"}
Mar 20 09:16:22 crc kubenswrapper[4958]: I0320 09:16:22.445931 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" path="/var/lib/kubelet/pods/f94d4da7-b898-454c-b5d8-119bb782d6cb/volumes"
Mar 20 09:16:22 crc kubenswrapper[4958]: I0320 09:16:22.987070 4958 generic.go:334] "Generic (PLEG): container finished" podID="3669e607-3d8e-4e9e-8468-26d0032e0590" containerID="fd21d43d676c2d6196b07c9cea6cebdc321e9bc0b9f7cd77e9429f7104cab2cd" exitCode=0
Mar 20 09:16:22 crc kubenswrapper[4958]: I0320 09:16:22.987132 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerDied","Data":"fd21d43d676c2d6196b07c9cea6cebdc321e9bc0b9f7cd77e9429f7104cab2cd"}
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.469076 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
Mar 20 09:16:23 crc kubenswrapper[4958]: E0320 09:16:23.474019 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="extract-content"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.474058 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="extract-content"
Mar 20 09:16:23 crc kubenswrapper[4958]: E0320 09:16:23.474092 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="extract-utilities"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.474101 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="extract-utilities"
Mar 20 09:16:23 crc kubenswrapper[4958]: E0320 09:16:23.474117 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="registry-server"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.474128 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="registry-server"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.474296 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="f94d4da7-b898-454c-b5d8-119bb782d6cb" containerName="registry-server"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.476110 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.490744 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
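The "SyncLoop ADD" and "SyncLoop UPDATE" entries with source="api" are the kubelet reacting to pod objects arriving over its API-server watch. The same watch mechanism is available to any client through an informer; a minimal client-go sketch that logs ADD/UPDATE/DELETE for pods in openshift-marketplace follows (the kubeconfig path is an assumption; in-cluster code would use rest.InClusterConfig instead).

```go
package main

import (
	"fmt"
	"time"

	corev1 "k8s.io/api/core/v1"
	"k8s.io/client-go/informers"
	"k8s.io/client-go/kubernetes"
	"k8s.io/client-go/tools/cache"
	"k8s.io/client-go/tools/clientcmd"
)

func main() {
	// Assumed kubeconfig location for this sketch.
	cfg, err := clientcmd.BuildConfigFromFlags("", "/root/.kube/config")
	if err != nil {
		panic(err)
	}
	client := kubernetes.NewForConfigOrDie(cfg)

	// Watch pods in one namespace, resyncing every 30s.
	factory := informers.NewSharedInformerFactoryWithOptions(
		client, 30*time.Second, informers.WithNamespace("openshift-marketplace"))
	podInformer := factory.Core().V1().Pods().Informer()

	podInformer.AddEventHandler(cache.ResourceEventHandlerFuncs{
		AddFunc: func(obj interface{}) {
			fmt.Printf("SyncLoop ADD %s\n", obj.(*corev1.Pod).Name)
		},
		UpdateFunc: func(_, obj interface{}) {
			fmt.Printf("SyncLoop UPDATE %s\n", obj.(*corev1.Pod).Name)
		},
		DeleteFunc: func(obj interface{}) {
			// Deletes can deliver a tombstone rather than a Pod.
			if pod, ok := obj.(*corev1.Pod); ok {
				fmt.Printf("SyncLoop DELETE %s\n", pod.Name)
			}
		},
	})

	stop := make(chan struct{})
	defer close(stop)
	factory.Start(stop)
	cache.WaitForCacheSync(stop, podInformer.HasSynced)
	select {} // keep watching
}
```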
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.620860 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.621057 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.621291 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.725866 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.725966 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.726011 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.726955 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.726994 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.756869 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq\") pod \"redhat-marketplace-wn2dx\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") " pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:23 crc kubenswrapper[4958]: I0320 09:16:23.833806 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.022552 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"09cb77e603f44e3927da60992f2bd58b149272fe5ce81d008bd028dd6bc10d16"}
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.022635 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"3cad9b03030eac66f2e9f01e7abdb835fabb758e13848704067d0766f4f727fa"}
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.022654 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"7197c2becd9c6c49079c1e26f93022b8cde35aa349de446bc75c82bf84158ad7"}
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.022667 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"62c33ff4f6dffe3f20968d2ce9af3e4866fcd52b7f1add52060dc33b47fd36c5"}
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.022680 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"6e7d1a82a01640ceb8bbecb7df65cf5c98882f656df1323e2d4e23e000608c33"}
Mar 20 09:16:24 crc kubenswrapper[4958]: I0320 09:16:24.352639 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
Mar 20 09:16:24 crc kubenswrapper[4958]: W0320 09:16:24.358080 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbcdcd227_04bf_4b5d_9048_358ecf2cbf14.slice/crio-8b1fe3dd9f858d2aed12c73f10e38fb81c7c1c86f5ccd3cf150526a02356b1c0 WatchSource:0}: Error finding container 8b1fe3dd9f858d2aed12c73f10e38fb81c7c1c86f5ccd3cf150526a02356b1c0: Status 404 returned error can't find the container with id 8b1fe3dd9f858d2aed12c73f10e38fb81c7c1c86f5ccd3cf150526a02356b1c0
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.033231 4958 generic.go:334] "Generic (PLEG): container finished" podID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerID="f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3" exitCode=0
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.033324 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerDied","Data":"f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3"}
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.033862 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerStarted","Data":"8b1fe3dd9f858d2aed12c73f10e38fb81c7c1c86f5ccd3cf150526a02356b1c0"}
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.039545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-jsg5p" event={"ID":"3669e607-3d8e-4e9e-8468-26d0032e0590","Type":"ContainerStarted","Data":"5d8a2819d0929edf59cc2ba8ed6df61bcd3d3bba5ab889da674102937f3f1955"}
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.039950 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-jsg5p"
Mar 20 09:16:25 crc kubenswrapper[4958]: I0320 09:16:25.092747 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-jsg5p" podStartSLOduration=6.422701345 podStartE2EDuration="15.092720819s" podCreationTimestamp="2026-03-20 09:16:10 +0000 UTC" firstStartedPulling="2026-03-20 09:16:11.170676753 +0000 UTC m=+991.492692711" lastFinishedPulling="2026-03-20 09:16:19.840696227 +0000 UTC m=+1000.162712185" observedRunningTime="2026-03-20 09:16:25.086995461 +0000 UTC m=+1005.409011419" watchObservedRunningTime="2026-03-20 09:16:25.092720819 +0000 UTC m=+1005.414736777"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.019366 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-jsg5p"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.048333 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerStarted","Data":"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"}
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.065119 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-jsg5p"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.521041 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.521518 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.521925 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.522796 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:16:26 crc kubenswrapper[4958]: I0320 09:16:26.522872 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6" gracePeriod=600
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.062656 4958 generic.go:334] "Generic (PLEG): container finished" podID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerID="d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13" exitCode=0
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.062749 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerDied","Data":"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"}
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.068180 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6" exitCode=0
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.069761 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6"}
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.069861 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088"}
Mar 20 09:16:27 crc kubenswrapper[4958]: I0320 09:16:27.069993 4958 scope.go:117] "RemoveContainer" containerID="cddc3aaf749f620c4810fa0b2192721051e7b180c369b36b46b439825fe97a42"
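The machine-config-daemon sequence above is the full liveness path: patch_prober records the failing GET against http://127.0.0.1:8798/health (connection refused), the prober marks the probe failed, and kuberuntime_manager schedules a restart, killing the old container with gracePeriod=600. One detail the log does not show is that a single failure is normally not enough: the restart fires only after failureThreshold consecutive failures. A sketch of that counting against the same endpoint; the threshold (the kubelet default of 3) and the one-second period are assumptions here, not values read from this pod's spec.

```go
package main

import (
	"fmt"
	"net/http"
	"time"
)

func main() {
	// The exact check that failed in the log; nothing listens on the
	// port while the daemon is down, so Get returns "connection refused".
	const url = "http://127.0.0.1:8798/health"
	const failureThreshold = 3 // kubelet default; an assumption here
	client := &http.Client{Timeout: time.Second}

	failures := 0
	for attempt := 0; attempt < 5; attempt++ {
		resp, err := client.Get(url)
		if err != nil {
			failures++
			fmt.Printf("Probe failed: %v (consecutive failures: %d)\n", err, failures)
		} else {
			resp.Body.Close()
			failures = 0 // any success resets the count
		}
		if failures >= failureThreshold {
			fmt.Println("Container failed liveness probe, will be restarted")
			return
		}
		time.Sleep(time.Second)
	}
}
```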
Mar 20 09:16:28 crc kubenswrapper[4958]: I0320 09:16:28.079453 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerStarted","Data":"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"}
Mar 20 09:16:28 crc kubenswrapper[4958]: I0320 09:16:28.100760 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-wn2dx" podStartSLOduration=2.400955076 podStartE2EDuration="5.100745364s" podCreationTimestamp="2026-03-20 09:16:23 +0000 UTC" firstStartedPulling="2026-03-20 09:16:25.036012433 +0000 UTC m=+1005.358028391" lastFinishedPulling="2026-03-20 09:16:27.735802711 +0000 UTC m=+1008.057818679" observedRunningTime="2026-03-20 09:16:28.099422818 +0000 UTC m=+1008.421438776" watchObservedRunningTime="2026-03-20 09:16:28.100745364 +0000 UTC m=+1008.422761322"
Mar 20 09:16:31 crc kubenswrapper[4958]: I0320 09:16:31.631802 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-wbqjj"
Mar 20 09:16:32 crc kubenswrapper[4958]: I0320 09:16:32.680437 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-zt86p"
Mar 20 09:16:33 crc kubenswrapper[4958]: I0320 09:16:33.834429 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:33 crc kubenswrapper[4958]: I0320 09:16:33.835077 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:33 crc kubenswrapper[4958]: I0320 09:16:33.893681 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:34 crc kubenswrapper[4958]: I0320 09:16:34.180419 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:34 crc kubenswrapper[4958]: I0320 09:16:34.238731 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
Mar 20 09:16:36 crc kubenswrapper[4958]: I0320 09:16:36.140521 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-wn2dx" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="registry-server" containerID="cri-o://8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c" gracePeriod=2
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.043179 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.149646 4958 generic.go:334] "Generic (PLEG): container finished" podID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerID="8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c" exitCode=0
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.149686 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerDied","Data":"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"}
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.149728 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-wn2dx" event={"ID":"bcdcd227-04bf-4b5d-9048-358ecf2cbf14","Type":"ContainerDied","Data":"8b1fe3dd9f858d2aed12c73f10e38fb81c7c1c86f5ccd3cf150526a02356b1c0"}
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.149750 4958 scope.go:117] "RemoveContainer" containerID="8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.149787 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-wn2dx"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.152163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities\") pod \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") "
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.152436 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content\") pod \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") "
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.152642 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq\") pod \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\" (UID: \"bcdcd227-04bf-4b5d-9048-358ecf2cbf14\") "
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.153892 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities" (OuterVolumeSpecName: "utilities") pod "bcdcd227-04bf-4b5d-9048-358ecf2cbf14" (UID: "bcdcd227-04bf-4b5d-9048-358ecf2cbf14"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.154052 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.158674 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq" (OuterVolumeSpecName: "kube-api-access-kqvlq") pod "bcdcd227-04bf-4b5d-9048-358ecf2cbf14" (UID: "bcdcd227-04bf-4b5d-9048-358ecf2cbf14"). InnerVolumeSpecName "kube-api-access-kqvlq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.177542 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bcdcd227-04bf-4b5d-9048-358ecf2cbf14" (UID: "bcdcd227-04bf-4b5d-9048-358ecf2cbf14"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.185324 4958 scope.go:117] "RemoveContainer" containerID="d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.205634 4958 scope.go:117] "RemoveContainer" containerID="f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.224647 4958 scope.go:117] "RemoveContainer" containerID="8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"
Mar 20 09:16:37 crc kubenswrapper[4958]: E0320 09:16:37.225095 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c\": container with ID starting with 8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c not found: ID does not exist" containerID="8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.225128 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"} err="failed to get container status \"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c\": rpc error: code = NotFound desc = could not find container \"8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c\": container with ID starting with 8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c not found: ID does not exist"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.225151 4958 scope.go:117] "RemoveContainer" containerID="d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"
Mar 20 09:16:37 crc kubenswrapper[4958]: E0320 09:16:37.225419 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13\": container with ID starting with d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13 not found: ID does not exist" containerID="d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.225440 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13"} err="failed to get container status \"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13\": rpc error: code = NotFound desc = could not find container \"d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13\": container with ID starting with d26546dc76bbb9d1574e31a81f2f88bab84ef4d2d552dbca1fb7a1f236c61a13 not found: ID does not exist"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.225451 4958 scope.go:117] "RemoveContainer" containerID="f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3"
Mar 20 09:16:37 crc kubenswrapper[4958]: E0320 09:16:37.225686 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3\": container with ID starting with f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3 not found: ID does not exist" containerID="f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3"
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.225723 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3"} err="failed to get container status \"f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3\": rpc error: code = NotFound desc = could not find container \"f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3\": container with ID starting with f3dc050e00e0895ab19064dbd252220b845cf685a8e9da786847caabf06b1cf3 not found: ID does not exist"
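The RemoveContainer / "ContainerStatus from runtime service failed" / "DeleteContainer returned error" triplets above are a benign race: the container is already gone, so the follow-up status lookup gets a gRPC NotFound, which is logged and then dropped because the desired end state (container removed) already holds. A sketch of treating NotFound as success, built with the same gRPC error shape; the removeContainer helper is hypothetical, not the CRI client.

```go
package main

import (
	"fmt"

	"google.golang.org/grpc/codes"
	"google.golang.org/grpc/status"
)

// removeContainer is a hypothetical stand-in for the CRI round trip; the
// log shows the runtime answering NotFound once a container is already gone.
func removeContainer(id string) error {
	return status.Errorf(codes.NotFound,
		"could not find container %q: ID does not exist", id)
}

func main() {
	id := "8cbf4a06779b5b9113d0f736ff0a2fe2775422bdb8ad7dc754257b3c1409d66c"
	if err := removeContainer(id); err != nil {
		// The container being absent is the goal of the delete, so a
		// NotFound answer is logged but not retried or escalated.
		if s, ok := status.FromError(err); ok && s.Code() == codes.NotFound {
			fmt.Println("container already removed; treating as success")
			return
		}
		panic(err) // any other error would be a real failure
	}
}
```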
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.255407 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.255443 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqvlq\" (UniqueName: \"kubernetes.io/projected/bcdcd227-04bf-4b5d-9048-358ecf2cbf14-kube-api-access-kqvlq\") on node \"crc\" DevicePath \"\""
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.491969 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
Mar 20 09:16:37 crc kubenswrapper[4958]: I0320 09:16:37.499666 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-wn2dx"]
Mar 20 09:16:38 crc kubenswrapper[4958]: I0320 09:16:38.445284 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" path="/var/lib/kubelet/pods/bcdcd227-04bf-4b5d-9048-358ecf2cbf14/volumes"
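The "Cleaned up orphaned pod volumes dir" entry above is the housekeeping pass that follows SyncLoop REMOVE: any directory under /var/lib/kubelet/pods whose UID no longer maps to an active pod is an orphan, and its volumes dir is removed once unmounting has finished. A read-only sketch of that scan; the active-pod set is hard-coded here (the kubelet takes it from its pod manager, and actually deletes the directory rather than just reporting it).

```go
package main

import (
	"fmt"
	"os"
	"path/filepath"
)

func main() {
	// UIDs of pods the node still runs; hard-coded for the sketch.
	activePods := map[string]bool{
		"6b91c78e-0310-4789-b3ef-caede75e5d1c": true,
	}

	const podsDir = "/var/lib/kubelet/pods"
	entries, err := os.ReadDir(podsDir)
	if err != nil {
		fmt.Println("read error:", err)
		return
	}
	for _, e := range entries {
		if !e.IsDir() || activePods[e.Name()] {
			continue // still active, or not a pod dir
		}
		volumes := filepath.Join(podsDir, e.Name(), "volumes")
		if _, err := os.Stat(volumes); err == nil {
			// The kubelet would remove this once all volumes are unmounted.
			fmt.Printf("Cleaned up orphaned pod volumes dir path=%q (would remove)\n", volumes)
		}
	}
}
```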
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.734009 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-2rnn4"]
Mar 20 09:16:39 crc kubenswrapper[4958]: E0320 09:16:39.734489 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="extract-utilities"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.734502 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="extract-utilities"
Mar 20 09:16:39 crc kubenswrapper[4958]: E0320 09:16:39.734519 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="registry-server"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.734525 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="registry-server"
Mar 20 09:16:39 crc kubenswrapper[4958]: E0320 09:16:39.734539 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="extract-content"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.734547 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="extract-content"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.734673 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcdcd227-04bf-4b5d-9048-358ecf2cbf14" containerName="registry-server"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.735065 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.737349 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-gng54"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.737557 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.737835 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt"
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.749479 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2rnn4"]
Mar 20 09:16:39 crc kubenswrapper[4958]: I0320 09:16:39.900188 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n4kjs\" (UniqueName: \"kubernetes.io/projected/6b91c78e-0310-4789-b3ef-caede75e5d1c-kube-api-access-n4kjs\") pod \"openstack-operator-index-2rnn4\" (UID: \"6b91c78e-0310-4789-b3ef-caede75e5d1c\") " pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:40 crc kubenswrapper[4958]: I0320 09:16:40.001948 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n4kjs\" (UniqueName: \"kubernetes.io/projected/6b91c78e-0310-4789-b3ef-caede75e5d1c-kube-api-access-n4kjs\") pod \"openstack-operator-index-2rnn4\" (UID: \"6b91c78e-0310-4789-b3ef-caede75e5d1c\") " pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:40 crc kubenswrapper[4958]: I0320 09:16:40.023863 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n4kjs\" (UniqueName: \"kubernetes.io/projected/6b91c78e-0310-4789-b3ef-caede75e5d1c-kube-api-access-n4kjs\") pod \"openstack-operator-index-2rnn4\" (UID: \"6b91c78e-0310-4789-b3ef-caede75e5d1c\") " pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:40 crc kubenswrapper[4958]: I0320 09:16:40.052351 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:40 crc kubenswrapper[4958]: I0320 09:16:40.267043 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-2rnn4"]
Mar 20 09:16:41 crc kubenswrapper[4958]: I0320 09:16:41.020576 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-jsg5p"
Mar 20 09:16:41 crc kubenswrapper[4958]: I0320 09:16:41.184968 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2rnn4" event={"ID":"6b91c78e-0310-4789-b3ef-caede75e5d1c","Type":"ContainerStarted","Data":"5bea1362d9245668f4f18614a0df510846ea7f14222c4084d01457be4566fde1"}
Mar 20 09:16:43 crc kubenswrapper[4958]: I0320 09:16:43.096996 4958 scope.go:117] "RemoveContainer" containerID="eaa790f1e58f13748a111e56b30e665d6c527510bd44d967abb6893d5871028e"
Mar 20 09:16:43 crc kubenswrapper[4958]: I0320 09:16:43.197826 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-2rnn4" event={"ID":"6b91c78e-0310-4789-b3ef-caede75e5d1c","Type":"ContainerStarted","Data":"0b751c6e50c5490514e9a3d8196634ba7263b8bb6b7895d95bd2bf17ffb51016"}
Mar 20 09:16:43 crc kubenswrapper[4958]: I0320 09:16:43.221930 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-2rnn4" podStartSLOduration=2.075576869 podStartE2EDuration="4.221897906s" podCreationTimestamp="2026-03-20 09:16:39 +0000 UTC" firstStartedPulling="2026-03-20 09:16:40.283576539 +0000 UTC m=+1020.605592497" lastFinishedPulling="2026-03-20 09:16:42.429897566 +0000 UTC m=+1022.751913534" observedRunningTime="2026-03-20 09:16:43.217200497 +0000 UTC m=+1023.539216475" watchObservedRunningTime="2026-03-20 09:16:43.221897906 +0000 UTC m=+1023.543913884"
Mar 20 09:16:50 crc kubenswrapper[4958]: I0320 09:16:50.053290 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:50 crc kubenswrapper[4958]: I0320 09:16:50.053966 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:50 crc kubenswrapper[4958]: I0320 09:16:50.080849 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:50 crc kubenswrapper[4958]: I0320 09:16:50.282226 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-2rnn4"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.171649 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"]
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.172995 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.175313 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-mhlrs"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.180065 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"]
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.368661 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8ttd\" (UniqueName: \"kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.368792 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.368861 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.471219 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8ttd\" (UniqueName: \"kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.471325 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.471373 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.472245 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.472353 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.493861 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8ttd\" (UniqueName: \"kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd\") pod \"b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:51 crc kubenswrapper[4958]: I0320 09:16:51.790377 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"
Mar 20 09:16:52 crc kubenswrapper[4958]: I0320 09:16:52.241266 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb"]
Mar 20 09:16:53 crc kubenswrapper[4958]: I0320 09:16:53.274144 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8c24479-3659-4655-a67b-e4601afe1b52" containerID="91b73b37acacf0af48e6afbb57fd4e8856f565f5615e8e7f8941fad6d94112b6" exitCode=0
Mar 20 09:16:53 crc kubenswrapper[4958]: I0320 09:16:53.274208 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" event={"ID":"c8c24479-3659-4655-a67b-e4601afe1b52","Type":"ContainerDied","Data":"91b73b37acacf0af48e6afbb57fd4e8856f565f5615e8e7f8941fad6d94112b6"}
Mar 20 09:16:53 crc kubenswrapper[4958]: I0320 09:16:53.274545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" event={"ID":"c8c24479-3659-4655-a67b-e4601afe1b52","Type":"ContainerStarted","Data":"bb63756264488ff0be4b6ba937b696ba580238912405fc5c23e031f66cbb5f75"}
Mar 20 09:16:54 crc kubenswrapper[4958]: I0320 09:16:54.284422 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8c24479-3659-4655-a67b-e4601afe1b52" containerID="28ec590eff5d29c4b0a57486ba79c2c6d8b6254577d910ed3338a955117fe29b" exitCode=0
Mar 20 09:16:54 crc kubenswrapper[4958]: I0320 09:16:54.284518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" event={"ID":"c8c24479-3659-4655-a67b-e4601afe1b52","Type":"ContainerDied","Data":"28ec590eff5d29c4b0a57486ba79c2c6d8b6254577d910ed3338a955117fe29b"}
Mar 20 09:16:55 crc kubenswrapper[4958]: I0320 09:16:55.295176 4958 generic.go:334] "Generic (PLEG): container finished" podID="c8c24479-3659-4655-a67b-e4601afe1b52" containerID="1844a637bd67fe52f4f2b4601e933ab0b751987c41e22aff7abbebdeb207c5a7" exitCode=0
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" event={"ID":"c8c24479-3659-4655-a67b-e4601afe1b52","Type":"ContainerDied","Data":"1844a637bd67fe52f4f2b4601e933ab0b751987c41e22aff7abbebdeb207c5a7"} Mar 20 09:16:56 crc kubenswrapper[4958]: I0320 09:16:56.880317 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.061667 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8ttd\" (UniqueName: \"kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd\") pod \"c8c24479-3659-4655-a67b-e4601afe1b52\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.061798 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle\") pod \"c8c24479-3659-4655-a67b-e4601afe1b52\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.061879 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util\") pod \"c8c24479-3659-4655-a67b-e4601afe1b52\" (UID: \"c8c24479-3659-4655-a67b-e4601afe1b52\") " Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.063755 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle" (OuterVolumeSpecName: "bundle") pod "c8c24479-3659-4655-a67b-e4601afe1b52" (UID: "c8c24479-3659-4655-a67b-e4601afe1b52"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.071671 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd" (OuterVolumeSpecName: "kube-api-access-d8ttd") pod "c8c24479-3659-4655-a67b-e4601afe1b52" (UID: "c8c24479-3659-4655-a67b-e4601afe1b52"). InnerVolumeSpecName "kube-api-access-d8ttd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.081276 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util" (OuterVolumeSpecName: "util") pod "c8c24479-3659-4655-a67b-e4601afe1b52" (UID: "c8c24479-3659-4655-a67b-e4601afe1b52"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.164204 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8ttd\" (UniqueName: \"kubernetes.io/projected/c8c24479-3659-4655-a67b-e4601afe1b52-kube-api-access-d8ttd\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.164271 4958 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.164287 4958 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/c8c24479-3659-4655-a67b-e4601afe1b52-util\") on node \"crc\" DevicePath \"\"" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.315146 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" event={"ID":"c8c24479-3659-4655-a67b-e4601afe1b52","Type":"ContainerDied","Data":"bb63756264488ff0be4b6ba937b696ba580238912405fc5c23e031f66cbb5f75"} Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.315194 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bb63756264488ff0be4b6ba937b696ba580238912405fc5c23e031f66cbb5f75" Mar 20 09:16:57 crc kubenswrapper[4958]: I0320 09:16:57.315256 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.129054 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"] Mar 20 09:17:02 crc kubenswrapper[4958]: E0320 09:17:02.129704 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="pull" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.129722 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="pull" Mar 20 09:17:02 crc kubenswrapper[4958]: E0320 09:17:02.129740 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="extract" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.129748 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="extract" Mar 20 09:17:02 crc kubenswrapper[4958]: E0320 09:17:02.129764 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="util" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.129772 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="util" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.129918 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c24479-3659-4655-a67b-e4601afe1b52" containerName="extract" Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.130422 4958 util.go:30] "No sandbox for pod can be found. 
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.130422 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.132399 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-m48pt"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.136304 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hksbv\" (UniqueName: \"kubernetes.io/projected/72562712-a7df-49b8-af2c-6482fd0dcef0-kube-api-access-hksbv\") pod \"openstack-operator-controller-init-9df8dd5fd-2jzxj\" (UID: \"72562712-a7df-49b8-af2c-6482fd0dcef0\") " pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.165152 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"]
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.237622 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hksbv\" (UniqueName: \"kubernetes.io/projected/72562712-a7df-49b8-af2c-6482fd0dcef0-kube-api-access-hksbv\") pod \"openstack-operator-controller-init-9df8dd5fd-2jzxj\" (UID: \"72562712-a7df-49b8-af2c-6482fd0dcef0\") " pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.259949 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hksbv\" (UniqueName: \"kubernetes.io/projected/72562712-a7df-49b8-af2c-6482fd0dcef0-kube-api-access-hksbv\") pod \"openstack-operator-controller-init-9df8dd5fd-2jzxj\" (UID: \"72562712-a7df-49b8-af2c-6482fd0dcef0\") " pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.453420 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:02 crc kubenswrapper[4958]: I0320 09:17:02.902097 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"]
Mar 20 09:17:03 crc kubenswrapper[4958]: I0320 09:17:03.362896 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj" event={"ID":"72562712-a7df-49b8-af2c-6482fd0dcef0","Type":"ContainerStarted","Data":"5bfa35c03de9d5a2cfda55f346fab55d995ffc790b971d05de717ccd69fc461e"}
Mar 20 09:17:07 crc kubenswrapper[4958]: I0320 09:17:07.390420 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj" event={"ID":"72562712-a7df-49b8-af2c-6482fd0dcef0","Type":"ContainerStarted","Data":"d76e8b47ea223d16edff9695590aa0826034df7c516b96be70420005125d2f8b"}
Mar 20 09:17:07 crc kubenswrapper[4958]: I0320 09:17:07.391460 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
Mar 20 09:17:07 crc kubenswrapper[4958]: I0320 09:17:07.429200 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj" podStartSLOduration=1.37693719 podStartE2EDuration="5.429180764s" podCreationTimestamp="2026-03-20 09:17:02 +0000 UTC" firstStartedPulling="2026-03-20 09:17:02.912316587 +0000 UTC m=+1043.234332555" lastFinishedPulling="2026-03-20 09:17:06.964560171 +0000 UTC m=+1047.286576129" observedRunningTime="2026-03-20 09:17:07.425444471 +0000 UTC m=+1047.747460449" watchObservedRunningTime="2026-03-20 09:17:07.429180764 +0000 UTC m=+1047.751196732"
Mar 20 09:17:12 crc kubenswrapper[4958]: I0320 09:17:12.456868 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-9df8dd5fd-2jzxj"
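The pod_startup_latency_tracker entry above is the kubelet's own end-to-end startup measurement for the init pod: roughly 1.38 s counted against the startup SLO versus 5.43 s wall clock (the gap is dominated by the ~4 s image pull between firstStartedPulling and lastFinishedPulling). A sketch for collecting those measurements across a whole log, under the same kubelet.log / one-entry-per-line assumptions as above:

```python
import re

# Pull the kubelet's startup measurements out of
# "Observed pod startup duration" entries.
PAT = re.compile(
    r'"Observed pod startup duration" pod="(?P<pod>[^"]+)"'
    r' podStartSLOduration=(?P<slo>[\d.]+)'
    r' podStartE2EDuration="(?P<e2e>[\d.]+)s"'
)

with open("kubelet.log") as f:           # hypothetical path
    for line in f:
        m = PAT.search(line)
        if m:
            print(f'{m["pod"]}: SLO {float(m["slo"]):.2f}s, '
                  f'end-to-end {float(m["e2e"]):.2f}s')
```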
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.665570 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.667042 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.669799 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-xdn7c"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.674241 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.675477 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.678714 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-l54mg"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.684472 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.688972 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.694175 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.725975 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-9kqq8"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.734692 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.745899 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.750240 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.758863 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.770907 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-p75x6"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.805702 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.837877 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.838864 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.841298 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rz4wm\" (UniqueName: \"kubernetes.io/projected/afb56adf-873a-4757-90cb-62cc57e78669-kube-api-access-rz4wm\") pod \"barbican-operator-controller-manager-59bc569d95-tlzr6\" (UID: \"afb56adf-873a-4757-90cb-62cc57e78669\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.841550 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qklb4\" (UniqueName: \"kubernetes.io/projected/668ba749-8ef8-42fc-bb13-7b5c6e207ed6-kube-api-access-qklb4\") pod \"cinder-operator-controller-manager-8d58dc466-4ljl2\" (UID: \"668ba749-8ef8-42fc-bb13-7b5c6e207ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.841668 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mshk9\" (UniqueName: \"kubernetes.io/projected/07df28d7-7683-4309-bee9-9aa2de96b9ce-kube-api-access-mshk9\") pod \"designate-operator-controller-manager-588d4d986b-j4w4r\" (UID: \"07df28d7-7683-4309-bee9-9aa2de96b9ce\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.846894 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-hh42p"
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.856770 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5"]
Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.857542 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5"
Mar 20 09:17:30 crc kubenswrapper[4958]: W0320 09:17:30.863760 4958 reflector.go:561] object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-82n8k": failed to list *v1.Secret: secrets "horizon-operator-controller-manager-dockercfg-82n8k" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Mar 20 09:17:30 crc kubenswrapper[4958]: E0320 09:17:30.863814 4958 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"horizon-operator-controller-manager-dockercfg-82n8k\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"horizon-operator-controller-manager-dockercfg-82n8k\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
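The forbidden/UnhandledError pair above is the node authorizer at work: the kubelet on node crc may only read secrets referenced by pods already bound to that node, so a secret watch that races ahead of the pod-to-node binding is rejected with "no relationship found between node 'crc' and this object". Such denials are normally transient during scheduling; a quick tally (same kubelet.log assumptions as the earlier sketches) makes a persistent case, which would indicate a real RBAC or binding problem, stand out:

```python
import re
from collections import Counter

# Count node-authorizer denials per secret. The pattern matches the plain
# warning form; the escaped \"...\" duplicate inside the UnhandledError
# entry is deliberately skipped so each incident is counted once.
FORBIDDEN = re.compile(r'secrets "(?P<name>[^"]+)" is forbidden')

counts = Counter()
with open("kubelet.log") as f:   # hypothetical path
    for line in f:
        m = FORBIDDEN.search(line)
        if m:
            counts[m["name"]] += 1

for name, n in counts.most_common():
    print(f"{n:3d}  {name}")
```

In this extract, each dockercfg secret appears only once or twice, consistent with a scheduling race rather than a misconfiguration.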
\"07df28d7-7683-4309-bee9-9aa2de96b9ce\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.942721 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rz4wm\" (UniqueName: \"kubernetes.io/projected/afb56adf-873a-4757-90cb-62cc57e78669-kube-api-access-rz4wm\") pod \"barbican-operator-controller-manager-59bc569d95-tlzr6\" (UID: \"afb56adf-873a-4757-90cb-62cc57e78669\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.942756 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rb264\" (UniqueName: \"kubernetes.io/projected/b381ba24-046d-4474-8581-6235812526a7-kube-api-access-rb264\") pod \"glance-operator-controller-manager-79df6bcc97-2f897\" (UID: \"b381ba24-046d-4474-8581-6235812526a7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" Mar 20 09:17:30 crc kubenswrapper[4958]: I0320 09:17:30.956756 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"] Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.015382 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qklb4\" (UniqueName: \"kubernetes.io/projected/668ba749-8ef8-42fc-bb13-7b5c6e207ed6-kube-api-access-qklb4\") pod \"cinder-operator-controller-manager-8d58dc466-4ljl2\" (UID: \"668ba749-8ef8-42fc-bb13-7b5c6e207ed6\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.027041 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rz4wm\" (UniqueName: \"kubernetes.io/projected/afb56adf-873a-4757-90cb-62cc57e78669-kube-api-access-rz4wm\") pod \"barbican-operator-controller-manager-59bc569d95-tlzr6\" (UID: \"afb56adf-873a-4757-90cb-62cc57e78669\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.030924 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"] Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.032415 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.040332 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mshk9\" (UniqueName: \"kubernetes.io/projected/07df28d7-7683-4309-bee9-9aa2de96b9ce-kube-api-access-mshk9\") pod \"designate-operator-controller-manager-588d4d986b-j4w4r\" (UID: \"07df28d7-7683-4309-bee9-9aa2de96b9ce\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.043741 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbtx7\" (UniqueName: \"kubernetes.io/projected/6d3c18bd-2666-4490-afbb-dbb844e5dc36-kube-api-access-pbtx7\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.043865 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ls2s2\" (UniqueName: \"kubernetes.io/projected/60ab48da-f2e7-47d0-829e-922b0726e372-kube-api-access-ls2s2\") pod \"horizon-operator-controller-manager-8464cc45fb-69br5\" (UID: \"60ab48da-f2e7-47d0-829e-922b0726e372\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.043956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rb264\" (UniqueName: \"kubernetes.io/projected/b381ba24-046d-4474-8581-6235812526a7-kube-api-access-rb264\") pod \"glance-operator-controller-manager-79df6bcc97-2f897\" (UID: \"b381ba24-046d-4474-8581-6235812526a7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.043998 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45h52\" (UniqueName: \"kubernetes.io/projected/22ddf7c6-5d86-436a-b6ea-a622e854725e-kube-api-access-45h52\") pod \"heat-operator-controller-manager-67dd5f86f5-b8zbp\" (UID: \"22ddf7c6-5d86-436a-b6ea-a622e854725e\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.044059 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.046753 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-hhkcb" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.062656 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"] Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.064339 4958 util.go:30] "No sandbox for pod can be found. 
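Every volume in this wave moves through the same three phases, each with its own log site: VerifyControllerAttachedVolume (reconciler_common.go:245), MountVolume started (reconciler_common.go:218), and MountVolume.SetUp succeeded (operation_generator.go:637). Pairing the last two per volume gives a rough mount latency. A sketch under the same kubelet.log assumptions; note that klog escapes the quotes around volume names, hence the \\" in the patterns:

```python
import re
from datetime import datetime

# Pair each "MountVolume started" entry with its "MountVolume.SetUp succeeded"
# entry and report the elapsed time. Keyed by volume name only, which is good
# enough for this extract (the kube-api-access-* names are unique per pod).
TS    = re.compile(r'I\d{4} (?P<ts>\d{2}:\d{2}:\d{2}\.\d+)')
START = re.compile(r'MountVolume started for volume \\"(?P<vol>[^"\\]+)\\"')
DONE  = re.compile(r'MountVolume\.SetUp succeeded for volume \\"(?P<vol>[^"\\]+)\\"')

def klog_time(line):
    """Time of day from the klog header; fine for a single-day log."""
    m = TS.search(line)
    return datetime.strptime(m["ts"], "%H:%M:%S.%f") if m else None

started = {}
with open("kubelet.log") as f:   # hypothetical path
    for line in f:
        if (m := START.search(line)):
            started[m["vol"]] = klog_time(line)
        elif (m := DONE.search(line)) and m["vol"] in started:
            dt = klog_time(line) - started.pop(m["vol"])
            print(f'{m["vol"]}: mounted in {dt.total_seconds()*1000:.0f} ms')
```

Against the entries above this reports tens-of-milliseconds mounts for the projected service-account tokens; volumes left in `started` at the end (such as the infra-operator cert below) are the ones that never mounted.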
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.064339 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.086052 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.086202 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-4smr7"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.086949 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.098713 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rb264\" (UniqueName: \"kubernetes.io/projected/b381ba24-046d-4474-8581-6235812526a7-kube-api-access-rb264\") pod \"glance-operator-controller-manager-79df6bcc97-2f897\" (UID: \"b381ba24-046d-4474-8581-6235812526a7\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"
Mar 20 09:17:31 crc kubenswrapper[4958]: W0320 09:17:31.116750 4958 reflector.go:561] object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-x5k8x": failed to list *v1.Secret: secrets "manila-operator-controller-manager-dockercfg-x5k8x" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openstack-operators": no relationship found between node 'crc' and this object
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.116820 4958 reflector.go:158] "Unhandled Error" err="object-\"openstack-operators\"/\"manila-operator-controller-manager-dockercfg-x5k8x\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"manila-operator-controller-manager-dockercfg-x5k8x\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openstack-operators\": no relationship found between node 'crc' and this object" logger="UnhandledError"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.116854 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.155049 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45h52\" (UniqueName: \"kubernetes.io/projected/22ddf7c6-5d86-436a-b6ea-a622e854725e-kube-api-access-45h52\") pod \"heat-operator-controller-manager-67dd5f86f5-b8zbp\" (UID: \"22ddf7c6-5d86-436a-b6ea-a622e854725e\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.155884 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbtx7\" (UniqueName: \"kubernetes.io/projected/6d3c18bd-2666-4490-afbb-dbb844e5dc36-kube-api-access-pbtx7\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.155985 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.156717 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ls2s2\" (UniqueName: \"kubernetes.io/projected/60ab48da-f2e7-47d0-829e-922b0726e372-kube-api-access-ls2s2\") pod \"horizon-operator-controller-manager-8464cc45fb-69br5\" (UID: \"60ab48da-f2e7-47d0-829e-922b0726e372\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.156764 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.158041 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.156797 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jhffs\" (UniqueName: \"kubernetes.io/projected/46972026-e8fb-46c0-bd8a-93d33a1eaccd-kube-api-access-jhffs\") pod \"keystone-operator-controller-manager-768b96df4c-fvr27\" (UID: \"46972026-e8fb-46c0-bd8a-93d33a1eaccd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.158723 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hzdnx\" (UniqueName: \"kubernetes.io/projected/af8e40f1-7e87-4ed7-8136-1ec1ad714bac-kube-api-access-hzdnx\") pod \"ironic-operator-controller-manager-6f787dddc9-wq2w4\" (UID: \"af8e40f1-7e87-4ed7-8136-1ec1ad714bac\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.159040 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krwnz\" (UniqueName: \"kubernetes.io/projected/9d54ed62-2236-4fdc-9fdb-f2042817795e-kube-api-access-krwnz\") pod \"manila-operator-controller-manager-55f864c847-ch6hb\" (UID: \"9d54ed62-2236-4fdc-9fdb-f2042817795e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.159357 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.160436 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.160732 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:31.660704558 +0000 UTC m=+1071.982720516 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found
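This mount failure is noisy but expected at this point: the infra-operator webhook certificate secret simply does not exist yet (it is presumably created asynchronously, after the operator deployments are rolled out), so the kubelet parks the cert volume and schedules a retry, doubling the delay on each attempt; durationBeforeRetry is 500ms here and 1s on the next failure further down. A sketch for watching that backoff grow, under the same kubelet.log assumptions:

```python
import re

# Track the kubelet's per-volume retry backoff by scraping the volumeName
# and durationBeforeRetry fields from nestedpendingoperations errors.
RETRY = re.compile(
    r'Operation for "\{volumeName:(?P<vol>\S+) podName:\S* nodeName:\}" failed\.'
    r'.*\(durationBeforeRetry (?P<delay>\S+)\)'
)

with open("kubelet.log") as f:   # hypothetical path
    for line in f:
        m = RETRY.search(line)
        if m:
            print(f'{m["vol"]}: next retry in {m["delay"]}')
```

For the infra-operator cert volume this prints 500ms followed by 1s; a volume whose delay keeps doubling long after cluster startup is worth investigating.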
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.201343 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.243868 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-sx8tz"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.262445 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.264367 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jhffs\" (UniqueName: \"kubernetes.io/projected/46972026-e8fb-46c0-bd8a-93d33a1eaccd-kube-api-access-jhffs\") pod \"keystone-operator-controller-manager-768b96df4c-fvr27\" (UID: \"46972026-e8fb-46c0-bd8a-93d33a1eaccd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.264462 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hzdnx\" (UniqueName: \"kubernetes.io/projected/af8e40f1-7e87-4ed7-8136-1ec1ad714bac-kube-api-access-hzdnx\") pod \"ironic-operator-controller-manager-6f787dddc9-wq2w4\" (UID: \"af8e40f1-7e87-4ed7-8136-1ec1ad714bac\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.264505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-krwnz\" (UniqueName: \"kubernetes.io/projected/9d54ed62-2236-4fdc-9fdb-f2042817795e-kube-api-access-krwnz\") pod \"manila-operator-controller-manager-55f864c847-ch6hb\" (UID: \"9d54ed62-2236-4fdc-9fdb-f2042817795e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.286411 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbtx7\" (UniqueName: \"kubernetes.io/projected/6d3c18bd-2666-4490-afbb-dbb844e5dc36-kube-api-access-pbtx7\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.289963 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.299439 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.309272 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.311714 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.315297 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.318586 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ls2s2\" (UniqueName: \"kubernetes.io/projected/60ab48da-f2e7-47d0-829e-922b0726e372-kube-api-access-ls2s2\") pod \"horizon-operator-controller-manager-8464cc45fb-69br5\" (UID: \"60ab48da-f2e7-47d0-829e-922b0726e372\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.329472 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-h78w2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.371211 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-krwnz\" (UniqueName: \"kubernetes.io/projected/9d54ed62-2236-4fdc-9fdb-f2042817795e-kube-api-access-krwnz\") pod \"manila-operator-controller-manager-55f864c847-ch6hb\" (UID: \"9d54ed62-2236-4fdc-9fdb-f2042817795e\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.371331 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hzdnx\" (UniqueName: \"kubernetes.io/projected/af8e40f1-7e87-4ed7-8136-1ec1ad714bac-kube-api-access-hzdnx\") pod \"ironic-operator-controller-manager-6f787dddc9-wq2w4\" (UID: \"af8e40f1-7e87-4ed7-8136-1ec1ad714bac\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.371440 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.371830 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jhffs\" (UniqueName: \"kubernetes.io/projected/46972026-e8fb-46c0-bd8a-93d33a1eaccd-kube-api-access-jhffs\") pod \"keystone-operator-controller-manager-768b96df4c-fvr27\" (UID: \"46972026-e8fb-46c0-bd8a-93d33a1eaccd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.372615 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5dqm\" (UniqueName: \"kubernetes.io/projected/4721bc9e-cb87-47df-a166-cdd08d38568d-kube-api-access-s5dqm\") pod \"mariadb-operator-controller-manager-67ccfc9778-5572j\" (UID: \"4721bc9e-cb87-47df-a166-cdd08d38568d\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.390793 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.395588 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.396609 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.404878 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-9wlbw"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.414727 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.457674 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.463428 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.476871 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.477230 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5dqm\" (UniqueName: \"kubernetes.io/projected/4721bc9e-cb87-47df-a166-cdd08d38568d-kube-api-access-s5dqm\") pod \"mariadb-operator-controller-manager-67ccfc9778-5572j\" (UID: \"4721bc9e-cb87-47df-a166-cdd08d38568d\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.477794 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtpjg\" (UniqueName: \"kubernetes.io/projected/7246ddd6-d5b3-48a0-8581-42e5ff63f6eb-kube-api-access-wtpjg\") pod \"neutron-operator-controller-manager-767865f676-qfwqm\" (UID: \"7246ddd6-d5b3-48a0-8581-42e5ff63f6eb\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.482371 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.516182 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.523962 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-zwtk9"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.564303 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5dqm\" (UniqueName: \"kubernetes.io/projected/4721bc9e-cb87-47df-a166-cdd08d38568d-kube-api-access-s5dqm\") pod \"mariadb-operator-controller-manager-67ccfc9778-5572j\" (UID: \"4721bc9e-cb87-47df-a166-cdd08d38568d\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.564387 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.576711 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.578001 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.581435 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtpjg\" (UniqueName: \"kubernetes.io/projected/7246ddd6-d5b3-48a0-8581-42e5ff63f6eb-kube-api-access-wtpjg\") pod \"neutron-operator-controller-manager-767865f676-qfwqm\" (UID: \"7246ddd6-d5b3-48a0-8581-42e5ff63f6eb\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.581514 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8vl4\" (UniqueName: \"kubernetes.io/projected/049aadcd-754d-4c89-b1cf-8ae3aa2f7748-kube-api-access-n8vl4\") pod \"octavia-operator-controller-manager-5b9f45d989-bqxpp\" (UID: \"049aadcd-754d-4c89-b1cf-8ae3aa2f7748\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.581581 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qnf5\" (UniqueName: \"kubernetes.io/projected/1dc86ca0-19a7-44f2-90f4-40faf6f6308a-kube-api-access-7qnf5\") pod \"nova-operator-controller-manager-5d488d59fb-p95zp\" (UID: \"1dc86ca0-19a7-44f2-90f4-40faf6f6308a\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.581746 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-c9xkk"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.619769 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtpjg\" (UniqueName: \"kubernetes.io/projected/7246ddd6-d5b3-48a0-8581-42e5ff63f6eb-kube-api-access-wtpjg\") pod \"neutron-operator-controller-manager-767865f676-qfwqm\" (UID: \"7246ddd6-d5b3-48a0-8581-42e5ff63f6eb\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.630802 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.671395 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.673534 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.685521 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8vl4\" (UniqueName: \"kubernetes.io/projected/049aadcd-754d-4c89-b1cf-8ae3aa2f7748-kube-api-access-n8vl4\") pod \"octavia-operator-controller-manager-5b9f45d989-bqxpp\" (UID: \"049aadcd-754d-4c89-b1cf-8ae3aa2f7748\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.685646 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qnf5\" (UniqueName: \"kubernetes.io/projected/1dc86ca0-19a7-44f2-90f4-40faf6f6308a-kube-api-access-7qnf5\") pod \"nova-operator-controller-manager-5d488d59fb-p95zp\" (UID: \"1dc86ca0-19a7-44f2-90f4-40faf6f6308a\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.685678 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt7gn\" (UniqueName: \"kubernetes.io/projected/88be297b-cdd1-4b8d-ae88-eb6219f0f156-kube-api-access-kt7gn\") pod \"ovn-operator-controller-manager-884679f54-llgf2\" (UID: \"88be297b-cdd1-4b8d-ae88-eb6219f0f156\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.685757 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7k4m\" (UniqueName: \"kubernetes.io/projected/70f92bb8-0cc8-4804-a8d9-d5d3441e953e-kube-api-access-v7k4m\") pod \"placement-operator-controller-manager-5784578c99-pg9qm\" (UID: \"70f92bb8-0cc8-4804-a8d9-d5d3441e953e\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.685791 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.685960 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.686028 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:32.686005305 +0000 UTC m=+1073.008021263 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.690725 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.698130 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.703129 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-l45z9"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.703834 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.716008 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.717308 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kt85q"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.720639 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.735672 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.756403 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qnf5\" (UniqueName: \"kubernetes.io/projected/1dc86ca0-19a7-44f2-90f4-40faf6f6308a-kube-api-access-7qnf5\") pod \"nova-operator-controller-manager-5d488d59fb-p95zp\" (UID: \"1dc86ca0-19a7-44f2-90f4-40faf6f6308a\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.768213 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8vl4\" (UniqueName: \"kubernetes.io/projected/049aadcd-754d-4c89-b1cf-8ae3aa2f7748-kube-api-access-n8vl4\") pod \"octavia-operator-controller-manager-5b9f45d989-bqxpp\" (UID: \"049aadcd-754d-4c89-b1cf-8ae3aa2f7748\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.779216 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.789133 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.789335 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7k4m\" (UniqueName: \"kubernetes.io/projected/70f92bb8-0cc8-4804-a8d9-d5d3441e953e-kube-api-access-v7k4m\") pod \"placement-operator-controller-manager-5784578c99-pg9qm\" (UID: \"70f92bb8-0cc8-4804-a8d9-d5d3441e953e\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.791842 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtwzw\" (UniqueName: \"kubernetes.io/projected/58536825-54ec-4942-a17e-50d7db114ff9-kube-api-access-mtwzw\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.791994 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt7gn\" (UniqueName: \"kubernetes.io/projected/88be297b-cdd1-4b8d-ae88-eb6219f0f156-kube-api-access-kt7gn\") pod \"ovn-operator-controller-manager-884679f54-llgf2\" (UID: \"88be297b-cdd1-4b8d-ae88-eb6219f0f156\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.808342 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.809367 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.817807 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-k4v6x"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.829383 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7k4m\" (UniqueName: \"kubernetes.io/projected/70f92bb8-0cc8-4804-a8d9-d5d3441e953e-kube-api-access-v7k4m\") pod \"placement-operator-controller-manager-5784578c99-pg9qm\" (UID: \"70f92bb8-0cc8-4804-a8d9-d5d3441e953e\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.832532 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt7gn\" (UniqueName: \"kubernetes.io/projected/88be297b-cdd1-4b8d-ae88-eb6219f0f156-kube-api-access-kt7gn\") pod \"ovn-operator-controller-manager-884679f54-llgf2\" (UID: \"88be297b-cdd1-4b8d-ae88-eb6219f0f156\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.846391 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.848766 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.849796 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.867347 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"]
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.871842 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-mmbc2"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.895014 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6x77\" (UniqueName: \"kubernetes.io/projected/b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc-kube-api-access-c6x77\") pod \"swift-operator-controller-manager-c674c5965-pfz7r\" (UID: \"b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.895085 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmplr\" (UniqueName: \"kubernetes.io/projected/934a0099-92f4-4fd1-b910-28c8a0f50d1e-kube-api-access-jmplr\") pod \"telemetry-operator-controller-manager-d6b694c5-d8b2d\" (UID: \"934a0099-92f4-4fd1-b910-28c8a0f50d1e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.895172 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.895254 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtwzw\" (UniqueName: \"kubernetes.io/projected/58536825-54ec-4942-a17e-50d7db114ff9-kube-api-access-mtwzw\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.895820 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 09:17:31 crc kubenswrapper[4958]: E0320 09:17:31.895882 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:32.395860552 +0000 UTC m=+1072.717876510 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.934567 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r"]
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.964471 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtwzw\" (UniqueName: \"kubernetes.io/projected/58536825-54ec-4942-a17e-50d7db114ff9-kube-api-access-mtwzw\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.989017 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d"] Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.997190 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c6x77\" (UniqueName: \"kubernetes.io/projected/b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc-kube-api-access-c6x77\") pod \"swift-operator-controller-manager-c674c5965-pfz7r\" (UID: \"b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" Mar 20 09:17:31 crc kubenswrapper[4958]: I0320 09:17:31.997661 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jmplr\" (UniqueName: \"kubernetes.io/projected/934a0099-92f4-4fd1-b910-28c8a0f50d1e-kube-api-access-jmplr\") pod \"telemetry-operator-controller-manager-d6b694c5-d8b2d\" (UID: \"934a0099-92f4-4fd1-b910-28c8a0f50d1e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.009055 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.023520 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.025809 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c6x77\" (UniqueName: \"kubernetes.io/projected/b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc-kube-api-access-c6x77\") pod \"swift-operator-controller-manager-c674c5965-pfz7r\" (UID: \"b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.037009 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.037367 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.038323 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.038647 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-r4s6t" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.041118 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-8kjzp" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.042591 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmplr\" (UniqueName: \"kubernetes.io/projected/934a0099-92f4-4fd1-b910-28c8a0f50d1e-kube-api-access-jmplr\") pod \"telemetry-operator-controller-manager-d6b694c5-d8b2d\" (UID: \"934a0099-92f4-4fd1-b910-28c8a0f50d1e\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.046980 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.060030 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.099735 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fr44\" (UniqueName: \"kubernetes.io/projected/21dbcd45-579e-42ed-a2ac-c0b9fc9482b8-kube-api-access-4fr44\") pod \"test-operator-controller-manager-5c5cb9c4d7-glfmx\" (UID: \"21dbcd45-579e-42ed-a2ac-c0b9fc9482b8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.099816 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sh2qx\" (UniqueName: \"kubernetes.io/projected/6db78af7-a32c-44b8-8450-d9478c3f9b1f-kube-api-access-sh2qx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-tl8ls\" (UID: \"6db78af7-a32c-44b8-8450-d9478c3f9b1f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.142590 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.158535 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.170924 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.171251 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.171528 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vj7hz" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.179209 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.204259 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.204369 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fr44\" (UniqueName: \"kubernetes.io/projected/21dbcd45-579e-42ed-a2ac-c0b9fc9482b8-kube-api-access-4fr44\") pod \"test-operator-controller-manager-5c5cb9c4d7-glfmx\" (UID: \"21dbcd45-579e-42ed-a2ac-c0b9fc9482b8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.204418 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sh2qx\" (UniqueName: \"kubernetes.io/projected/6db78af7-a32c-44b8-8450-d9478c3f9b1f-kube-api-access-sh2qx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-tl8ls\" (UID: \"6db78af7-a32c-44b8-8450-d9478c3f9b1f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.206786 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdnj2\" (UniqueName: \"kubernetes.io/projected/90e05567-054f-41de-a1b4-4dc11ae039db-kube-api-access-fdnj2\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.206913 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.236869 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.239267 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.246117 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-82n8k" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.248246 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fr44\" (UniqueName: \"kubernetes.io/projected/21dbcd45-579e-42ed-a2ac-c0b9fc9482b8-kube-api-access-4fr44\") pod \"test-operator-controller-manager-5c5cb9c4d7-glfmx\" (UID: \"21dbcd45-579e-42ed-a2ac-c0b9fc9482b8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.248944 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.257860 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.269582 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sh2qx\" (UniqueName: \"kubernetes.io/projected/6db78af7-a32c-44b8-8450-d9478c3f9b1f-kube-api-access-sh2qx\") pod \"watcher-operator-controller-manager-6c4d75f7f9-tl8ls\" (UID: \"6db78af7-a32c-44b8-8450-d9478c3f9b1f\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.271387 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.308352 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.308468 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.309101 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.309354 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.309904 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdnj2\" (UniqueName: 
\"kubernetes.io/projected/90e05567-054f-41de-a1b4-4dc11ae039db-kube-api-access-fdnj2\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.309999 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:32.809976843 +0000 UTC m=+1073.131992791 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "metrics-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.310018 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:32.810012264 +0000 UTC m=+1073.132028222 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.335857 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-x5k8x" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.340618 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdnj2\" (UniqueName: \"kubernetes.io/projected/90e05567-054f-41de-a1b4-4dc11ae039db-kube-api-access-fdnj2\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.341717 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.349452 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.411546 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.411816 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.411887 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:33.411865903 +0000 UTC m=+1073.733881861 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.482432 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.490295 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6"] Mar 20 09:17:32 crc kubenswrapper[4958]: W0320 09:17:32.524321 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod07df28d7_7683_4309_bee9_9aa2de96b9ce.slice/crio-b03c5df0815901fe234337c601b27d9a17e3b7983d4acd2f4456de55c0162423 WatchSource:0}: Error finding container b03c5df0815901fe234337c601b27d9a17e3b7983d4acd2f4456de55c0162423: Status 404 returned error can't find the container with id b03c5df0815901fe234337c601b27d9a17e3b7983d4acd2f4456de55c0162423 Mar 20 09:17:32 crc kubenswrapper[4958]: W0320 09:17:32.525641 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podafb56adf_873a_4757_90cb_62cc57e78669.slice/crio-70425c666a750c3de262c4722de46619ee3c4bc0c632e100f2a937320fe683ed WatchSource:0}: Error finding container 70425c666a750c3de262c4722de46619ee3c4bc0c632e100f2a937320fe683ed: Status 404 returned error can't find the container with id 70425c666a750c3de262c4722de46619ee3c4bc0c632e100f2a937320fe683ed Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.533124 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.559911 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" event={"ID":"22ddf7c6-5d86-436a-b6ea-a622e854725e","Type":"ContainerStarted","Data":"c84bc3dea8bc63b73622b6991b8ce53151046de53cc35e6487334518969e7c15"} Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.561470 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" event={"ID":"b381ba24-046d-4474-8581-6235812526a7","Type":"ContainerStarted","Data":"69fa55d36fedb2a1911d22219105c1cc51f9068924d8dc1f5486ae9a122b5836"} Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.570724 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" event={"ID":"afb56adf-873a-4757-90cb-62cc57e78669","Type":"ContainerStarted","Data":"70425c666a750c3de262c4722de46619ee3c4bc0c632e100f2a937320fe683ed"} Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.578202 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" event={"ID":"07df28d7-7683-4309-bee9-9aa2de96b9ce","Type":"ContainerStarted","Data":"b03c5df0815901fe234337c601b27d9a17e3b7983d4acd2f4456de55c0162423"} Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.643242 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2"] Mar 20 09:17:32 crc kubenswrapper[4958]: W0320 09:17:32.659711 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod668ba749_8ef8_42fc_bb13_7b5c6e207ed6.slice/crio-5d681a91b6912d5d247b8400214a39f901daa5e89b752306943297f87dc366f6 WatchSource:0}: Error finding container 5d681a91b6912d5d247b8400214a39f901daa5e89b752306943297f87dc366f6: Status 404 returned error can't find the container with id 5d681a91b6912d5d247b8400214a39f901daa5e89b752306943297f87dc366f6 Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.665883 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4"] Mar 20 09:17:32 crc kubenswrapper[4958]: W0320 09:17:32.709882 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaf8e40f1_7e87_4ed7_8136_1ec1ad714bac.slice/crio-a0e0eec0eb8674bb128d8ba88b2b5cb47e0e7b29581039305f999f187b55580b WatchSource:0}: Error finding container a0e0eec0eb8674bb128d8ba88b2b5cb47e0e7b29581039305f999f187b55580b: Status 404 returned error can't find the container with id a0e0eec0eb8674bb128d8ba88b2b5cb47e0e7b29581039305f999f187b55580b Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.716228 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.716435 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret 
"infra-operator-webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.716577 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:34.716515665 +0000 UTC m=+1075.038531783 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.786673 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.818422 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.818545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.818720 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.818781 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:33.818761434 +0000 UTC m=+1074.140777392 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.819149 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: E0320 09:17:32.819184 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:33.819172635 +0000 UTC m=+1074.141188593 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "metrics-server-cert" not found Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.907419 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j"] Mar 20 09:17:32 crc kubenswrapper[4958]: I0320 09:17:32.951043 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm"] Mar 20 09:17:32 crc kubenswrapper[4958]: W0320 09:17:32.953453 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7246ddd6_d5b3_48a0_8581_42e5ff63f6eb.slice/crio-298f650f4b684c29e5287120408bccafe01da853fd25c3b230b219808e07b5c3 WatchSource:0}: Error finding container 298f650f4b684c29e5287120408bccafe01da853fd25c3b230b219808e07b5c3: Status 404 returned error can't find the container with id 298f650f4b684c29e5287120408bccafe01da853fd25c3b230b219808e07b5c3 Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.024199 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-llgf2"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.079220 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp"] Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.094958 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1dc86ca0_19a7_44f2_90f4_40faf6f6308a.slice/crio-8e65d922d373ed227dd93013b9c2d22494e717ed2fb26046345370448070b2d9 WatchSource:0}: Error finding container 8e65d922d373ed227dd93013b9c2d22494e717ed2fb26046345370448070b2d9: Status 404 returned error can't find the container with id 8e65d922d373ed227dd93013b9c2d22494e717ed2fb26046345370448070b2d9 Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.214262 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.225857 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.235120 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp"] Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.238985 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod934a0099_92f4_4fd1_b910_28c8a0f50d1e.slice/crio-8e77d1e178c93d9ab223b891efdf3059df86bfa8b701289026bd32cc3b30059c WatchSource:0}: Error finding container 8e77d1e178c93d9ab223b891efdf3059df86bfa8b701289026bd32cc3b30059c: Status 404 returned error can't find the container with id 8e77d1e178c93d9ab223b891efdf3059df86bfa8b701289026bd32cc3b30059c Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.240442 4958 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod70f92bb8_0cc8_4804_a8d9_d5d3441e953e.slice/crio-956354973c71e7c376598586329354321dc63cb38a5e0d92efae49e13b12bbce WatchSource:0}: Error finding container 956354973c71e7c376598586329354321dc63cb38a5e0d92efae49e13b12bbce: Status 404 returned error can't find the container with id 956354973c71e7c376598586329354321dc63cb38a5e0d92efae49e13b12bbce Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.313017 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.319299 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.335853 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx"] Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.339426 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ls2s2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-operator-controller-manager-8464cc45fb-69br5_openstack-operators(60ab48da-f2e7-47d0-829e-922b0726e372): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 
09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.340635 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" podUID="60ab48da-f2e7-47d0-829e-922b0726e372" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.344689 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb"] Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.347088 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls"] Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.350268 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21dbcd45_579e_42ed_a2ac_c0b9fc9482b8.slice/crio-d443e0a59a94c99064481110b11a53a1ba4989606d32d1b362155c5331c46237 WatchSource:0}: Error finding container d443e0a59a94c99064481110b11a53a1ba4989606d32d1b362155c5331c46237: Status 404 returned error can't find the container with id d443e0a59a94c99064481110b11a53a1ba4989606d32d1b362155c5331c46237 Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.355898 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb9ad7ed0_c1c6_4e6e_ae98_29b02f2facdc.slice/crio-987e6645815e24c12e5c1696c30f2798bbdce2a19bb90dad9e1d0b25904c8d4c WatchSource:0}: Error finding container 987e6645815e24c12e5c1696c30f2798bbdce2a19bb90dad9e1d0b25904c8d4c: Status 404 returned error can't find the container with id 987e6645815e24c12e5c1696c30f2798bbdce2a19bb90dad9e1d0b25904c8d4c Mar 20 09:17:33 crc kubenswrapper[4958]: W0320 09:17:33.358734 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6db78af7_a32c_44b8_8450_d9478c3f9b1f.slice/crio-c85c8b40b6fa45bc4535218cda37162f6cf014b54e4bd541a56650fa455ce10c WatchSource:0}: Error finding container c85c8b40b6fa45bc4535218cda37162f6cf014b54e4bd541a56650fa455ce10c: Status 404 returned error can't find the container with id c85c8b40b6fa45bc4535218cda37162f6cf014b54e4bd541a56650fa455ce10c Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.360667 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-c6x77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-pfz7r_openstack-operators(b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.361462 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-sh2qx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-tl8ls_openstack-operators(6db78af7-a32c-44b8-8450-d9478c3f9b1f): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.361930 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-4fr44,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-glfmx_openstack-operators(21dbcd45-579e-42ed-a2ac-c0b9fc9482b8): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 09:17:33 crc 
kubenswrapper[4958]: E0320 09:17:33.362020 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" podUID="b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.362546 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" podUID="6db78af7-a32c-44b8-8450-d9478c3f9b1f" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.363652 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" podUID="21dbcd45-579e-42ed-a2ac-c0b9fc9482b8" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.432518 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.432922 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.433396 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:35.433372084 +0000 UTC m=+1075.755388042 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.595399 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" event={"ID":"88be297b-cdd1-4b8d-ae88-eb6219f0f156","Type":"ContainerStarted","Data":"2747cdd32b6df5f5b53c3aa0393dda3343e3598a857b4b1c3f2c1dd9d3919d94"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.608262 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" event={"ID":"46972026-e8fb-46c0-bd8a-93d33a1eaccd","Type":"ContainerStarted","Data":"cc567ce85ec6ad4658fd98156caabacf25fcfc6b001fcf8a68b070597a2a0c89"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.611518 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" event={"ID":"934a0099-92f4-4fd1-b910-28c8a0f50d1e","Type":"ContainerStarted","Data":"8e77d1e178c93d9ab223b891efdf3059df86bfa8b701289026bd32cc3b30059c"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.614836 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" event={"ID":"60ab48da-f2e7-47d0-829e-922b0726e372","Type":"ContainerStarted","Data":"ffa6b3840006dc4754687ae506a5e300296ac480f3cea4f6c0c85520db51a470"} Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.617657 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" podUID="60ab48da-f2e7-47d0-829e-922b0726e372" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.618881 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j" event={"ID":"4721bc9e-cb87-47df-a166-cdd08d38568d","Type":"ContainerStarted","Data":"7a834f0aa600d3524db03586f55275ffad272d80290d15719695e8ccc8052067"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.621450 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" event={"ID":"668ba749-8ef8-42fc-bb13-7b5c6e207ed6","Type":"ContainerStarted","Data":"5d681a91b6912d5d247b8400214a39f901daa5e89b752306943297f87dc366f6"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.623573 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm" event={"ID":"7246ddd6-d5b3-48a0-8581-42e5ff63f6eb","Type":"ContainerStarted","Data":"298f650f4b684c29e5287120408bccafe01da853fd25c3b230b219808e07b5c3"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.625971 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" 
event={"ID":"af8e40f1-7e87-4ed7-8136-1ec1ad714bac","Type":"ContainerStarted","Data":"a0e0eec0eb8674bb128d8ba88b2b5cb47e0e7b29581039305f999f187b55580b"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.627793 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" event={"ID":"b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc","Type":"ContainerStarted","Data":"987e6645815e24c12e5c1696c30f2798bbdce2a19bb90dad9e1d0b25904c8d4c"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.631417 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp" event={"ID":"1dc86ca0-19a7-44f2-90f4-40faf6f6308a","Type":"ContainerStarted","Data":"8e65d922d373ed227dd93013b9c2d22494e717ed2fb26046345370448070b2d9"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.633095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" event={"ID":"21dbcd45-579e-42ed-a2ac-c0b9fc9482b8","Type":"ContainerStarted","Data":"d443e0a59a94c99064481110b11a53a1ba4989606d32d1b362155c5331c46237"} Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.636506 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" podUID="21dbcd45-579e-42ed-a2ac-c0b9fc9482b8" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.636514 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" event={"ID":"6db78af7-a32c-44b8-8450-d9478c3f9b1f","Type":"ContainerStarted","Data":"c85c8b40b6fa45bc4535218cda37162f6cf014b54e4bd541a56650fa455ce10c"} Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.631609 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" podUID="b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.638400 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" podUID="6db78af7-a32c-44b8-8450-d9478c3f9b1f" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.642855 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" event={"ID":"70f92bb8-0cc8-4804-a8d9-d5d3441e953e","Type":"ContainerStarted","Data":"956354973c71e7c376598586329354321dc63cb38a5e0d92efae49e13b12bbce"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.645204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp" 
event={"ID":"049aadcd-754d-4c89-b1cf-8ae3aa2f7748","Type":"ContainerStarted","Data":"b42ed795b2573628b66142bad3ce063eb66f8ee133d2a0f827bd7e8b2449d42b"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.646824 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" event={"ID":"9d54ed62-2236-4fdc-9fdb-f2042817795e","Type":"ContainerStarted","Data":"54889033d87f6ad0448849d46fa7ac5b2e521d299e9c1fd8fa1dd20523deef32"} Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.840654 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:33 crc kubenswrapper[4958]: I0320 09:17:33.840845 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.840921 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.841021 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.841067 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:35.841040407 +0000 UTC m=+1076.163056365 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "metrics-server-cert" not found Mar 20 09:17:33 crc kubenswrapper[4958]: E0320 09:17:33.841717 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:35.841702706 +0000 UTC m=+1076.163718664 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.670199 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/horizon-operator@sha256:703ad3a2b749bce100f1e2a445312b65dc3b8b45e8c8ba59f311d3f8f3368113\\\"\"" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" podUID="60ab48da-f2e7-47d0-829e-922b0726e372" Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.670239 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" podUID="21dbcd45-579e-42ed-a2ac-c0b9fc9482b8" Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.670263 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" podUID="6db78af7-a32c-44b8-8450-d9478c3f9b1f" Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.670533 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" podUID="b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc" Mar 20 09:17:34 crc kubenswrapper[4958]: I0320 09:17:34.779832 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.780871 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:17:34 crc kubenswrapper[4958]: E0320 09:17:34.780918 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:38.780902946 +0000 UTC m=+1079.102918904 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: I0320 09:17:35.494035 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.494212 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.494277 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:39.4942581 +0000 UTC m=+1079.816274058 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: I0320 09:17:35.910427 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:35 crc kubenswrapper[4958]: I0320 09:17:35.910545 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.910719 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.910828 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:39.910805277 +0000 UTC m=+1080.232821235 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "metrics-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.911199 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:17:35 crc kubenswrapper[4958]: E0320 09:17:35.911246 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:39.911234779 +0000 UTC m=+1080.233250737 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found Mar 20 09:17:38 crc kubenswrapper[4958]: I0320 09:17:38.808173 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:17:38 crc kubenswrapper[4958]: E0320 09:17:38.808384 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 09:17:38 crc kubenswrapper[4958]: E0320 09:17:38.808877 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:46.80885347 +0000 UTC m=+1087.130869428 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: I0320 09:17:39.521738 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.522318 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.522366 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:17:47.522350977 +0000 UTC m=+1087.844366935 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: I0320 09:17:39.929675 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:39 crc kubenswrapper[4958]: I0320 09:17:39.930106 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.929956 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.930317 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:47.930301237 +0000 UTC m=+1088.252317195 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.930258 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 09:17:39 crc kubenswrapper[4958]: E0320 09:17:39.930729 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:17:47.930718468 +0000 UTC m=+1088.252734426 (durationBeforeRetry 8s). 
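The durationBeforeRetry values in the nestedpendingoperations entries above double on each failed mount attempt: 2s, 4s, 8s, and (just below) 16s. A minimal Go sketch of that doubling policy, with the 2s starting point read off these entries; the upper bound is an assumption for the sketch, not a value these log lines prove:

    package main

    import (
        "fmt"
        "time"
    )

    // Doubling retry delay mirroring the durationBeforeRetry progression
    // visible in the log (2s, 4s, 8s, 16s, ...).
    func nextRetryDelay(prev time.Duration) time.Duration {
        const maxDelay = 2 * time.Minute // assumed cap, not taken from these entries
        if prev <= 0 {
            return 2 * time.Second
        }
        if next := prev * 2; next < maxDelay {
            return next
        }
        return maxDelay
    }

    func main() {
        var d time.Duration
        for i := 0; i < 6; i++ {
            d = nextRetryDelay(d)
            fmt.Println(d) // 2s 4s 8s 16s 32s 1m4s
        }
    }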
Mar 20 09:17:46 crc kubenswrapper[4958]: I0320 09:17:46.848956 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"
Mar 20 09:17:46 crc kubenswrapper[4958]: E0320 09:17:46.849171 4958 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 20 09:17:46 crc kubenswrapper[4958]: E0320 09:17:46.849712 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert podName:6d3c18bd-2666-4490-afbb-dbb844e5dc36 nodeName:}" failed. No retries permitted until 2026-03-20 09:18:02.849684821 +0000 UTC m=+1103.171700869 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert") pod "infra-operator-controller-manager-577ccd856-pms6v" (UID: "6d3c18bd-2666-4490-afbb-dbb844e5dc36") : secret "infra-operator-webhook-server-cert" not found
Mar 20 09:17:46 crc kubenswrapper[4958]: E0320 09:17:46.971124 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55"
Mar 20 09:17:46 crc kubenswrapper[4958]: E0320 09:17:46.971339 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-kt7gn,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-llgf2_openstack-operators(88be297b-cdd1-4b8d-ae88-eb6219f0f156): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:46 crc kubenswrapper[4958]: E0320 09:17:46.972641 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" podUID="88be297b-cdd1-4b8d-ae88-eb6219f0f156"
Mar 20 09:17:47 crc kubenswrapper[4958]: I0320 09:17:47.564998 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.565474 4958 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.565544 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert podName:58536825-54ec-4942-a17e-50d7db114ff9 nodeName:}" failed. No retries permitted until 2026-03-20 09:18:03.565526112 +0000 UTC m=+1103.887542070 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" (UID: "58536825-54ec-4942-a17e-50d7db114ff9") : secret "openstack-baremetal-operator-webhook-server-cert" not found
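Every MountVolume.SetUp failure above reduces to the same cause: the referenced Secret (a webhook or metrics serving cert) does not exist yet in the openstack-operators namespace, so the kubelet keeps retrying the mount. A small client-go check for whether such a secret has appeared; this is a hypothetical debugging snippet, not part of the kubelet, and the kubeconfig path is illustrative:

    package main

    import (
        "context"
        "fmt"

        apierrors "k8s.io/apimachinery/pkg/api/errors"
        metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
        "k8s.io/client-go/kubernetes"
        "k8s.io/client-go/tools/clientcmd"
    )

    func main() {
        cfg, err := clientcmd.BuildConfigFromFlags("", "/path/to/kubeconfig") // illustrative path
        if err != nil {
            panic(err)
        }
        cs, err := kubernetes.NewForConfig(cfg)
        if err != nil {
            panic(err)
        }
        // Namespace and name taken from the failures above.
        _, err = cs.CoreV1().Secrets("openstack-operators").Get(
            context.TODO(), "infra-operator-webhook-server-cert", metav1.GetOptions{})
        switch {
        case apierrors.IsNotFound(err):
            fmt.Println("secret not created yet; kubelet will keep retrying the mount")
        case err != nil:
            panic(err)
        default:
            fmt.Println("secret exists; the next mount retry should succeed")
        }
    }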
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.646680 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.647393 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mshk9,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod designate-operator-controller-manager-588d4d986b-j4w4r_openstack-operators(07df28d7-7683-4309-bee9-9aa2de96b9ce): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.648734 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" podUID="07df28d7-7683-4309-bee9-9aa2de96b9ce"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.903353 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/designate-operator@sha256:12841b27173f5f1beeb83112e057c8753f4cf411f583fba4f0610fac0f60b7ad\\\"\"" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" podUID="07df28d7-7683-4309-bee9-9aa2de96b9ce"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.904716 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" podUID="88be297b-cdd1-4b8d-ae88-eb6219f0f156"
Mar 20 09:17:47 crc kubenswrapper[4958]: I0320 09:17:47.975476 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7"
Mar 20 09:17:47 crc kubenswrapper[4958]: I0320 09:17:47.975585 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7"
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.975838 4958 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.975838 4958 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.975920 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:18:03.975895199 +0000 UTC m=+1104.297911157 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "metrics-server-cert" not found
Mar 20 09:17:47 crc kubenswrapper[4958]: E0320 09:17:47.975949 4958 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs podName:90e05567-054f-41de-a1b4-4dc11ae039db nodeName:}" failed. No retries permitted until 2026-03-20 09:18:03.97594125 +0000 UTC m=+1104.297957198 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs") pod "openstack-operator-controller-manager-55958644c4-qr9t7" (UID: "90e05567-054f-41de-a1b4-4dc11ae039db") : secret "webhook-server-cert" not found
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.170307 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.170704 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-45h52,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-b8zbp_openstack-operators(22ddf7c6-5d86-436a-b6ea-a622e854725e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.171912 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" podUID="22ddf7c6-5d86-436a-b6ea-a622e854725e"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.831455 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.831870 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-v7k4m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-pg9qm_openstack-operators(70f92bb8-0cc8-4804-a8d9-d5d3441e953e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.833512 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" podUID="70f92bb8-0cc8-4804-a8d9-d5d3441e953e"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.907197 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" podUID="70f92bb8-0cc8-4804-a8d9-d5d3441e953e"
Mar 20 09:17:48 crc kubenswrapper[4958]: E0320 09:17:48.907443 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" podUID="22ddf7c6-5d86-436a-b6ea-a622e854725e"
Mar 20 09:17:49 crc kubenswrapper[4958]: E0320 09:17:49.734948 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da"
Mar 20 09:17:49 crc kubenswrapper[4958]: E0320 09:17:49.735199 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-krwnz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-ch6hb_openstack-operators(9d54ed62-2236-4fdc-9fdb-f2042817795e): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:49 crc kubenswrapper[4958]: E0320 09:17:49.737414 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" podUID="9d54ed62-2236-4fdc-9fdb-f2042817795e"
Mar 20 09:17:49 crc kubenswrapper[4958]: E0320 09:17:49.914389 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" podUID="9d54ed62-2236-4fdc-9fdb-f2042817795e"
Mar 20 09:17:50 crc kubenswrapper[4958]: E0320 09:17:50.446401 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56"
Mar 20 09:17:50 crc kubenswrapper[4958]: E0320 09:17:50.446584 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-jhffs,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod keystone-operator-controller-manager-768b96df4c-fvr27_openstack-operators(46972026-e8fb-46c0-bd8a-93d33a1eaccd): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError"
Mar 20 09:17:50 crc kubenswrapper[4958]: E0320 09:17:50.447809 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" podUID="46972026-e8fb-46c0-bd8a-93d33a1eaccd"
Mar 20 09:17:50 crc kubenswrapper[4958]: E0320 09:17:50.921404 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/keystone-operator@sha256:ec36a9083657587022f8471c9d5a71b87a7895398496e7fc546c73aa1eae4b56\\\"\"" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" podUID="46972026-e8fb-46c0-bd8a-93d33a1eaccd"
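Each failed pull above surfaces twice per pod: first as ErrImagePull when the CRI PullImage RPC fails ("context canceled" here), then as ImagePullBackOff on subsequent syncs until the pull backoff window expires and the pull is retried. A sketch of that two-state cycle; the 10s initial delay and 5m cap below are assumed, kubelet-default-like values, not something these log lines prove:

    package main

    import (
        "fmt"
        "time"
    )

    // ErrImagePull -> ImagePullBackOff cycle as seen in the entries above.
    func main() {
        backoff := 10 * time.Second      // assumed initial backoff
        const maxBackoff = 5 * time.Minute // assumed cap
        for attempt := 1; attempt <= 5; attempt++ {
            fmt.Printf("attempt %d: PullImage fails -> ErrImagePull\n", attempt)
            fmt.Printf("attempt %d: ImagePullBackOff, next pull in %v\n", attempt, backoff)
            backoff *= 2 // doubling backoff
            if backoff > maxBackoff {
                backoff = maxBackoff
            }
        }
    }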
(probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm" Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.942842 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" event={"ID":"afb56adf-873a-4757-90cb-62cc57e78669","Type":"ContainerStarted","Data":"8bddb77d5a0ad2c54bb43a0830e9395f79a70951d0658588af59ce3cb2ba43ce"} Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.943375 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.944765 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" event={"ID":"934a0099-92f4-4fd1-b910-28c8a0f50d1e","Type":"ContainerStarted","Data":"272316c228b177f80ba32b54268f6760e7c0e2d239e36306f64b58cd79bad2fd"} Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.945191 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.947831 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" event={"ID":"b381ba24-046d-4474-8581-6235812526a7","Type":"ContainerStarted","Data":"eb96e1727bf74aff52452dbed25bb35b94327d0daef440a1e6ad9c89fef1a570"} Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.947990 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.967968 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm" podStartSLOduration=3.911137078 podStartE2EDuration="21.967944431s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.954992121 +0000 UTC m=+1073.277008079" lastFinishedPulling="2026-03-20 09:17:51.011799464 +0000 UTC m=+1091.333815432" observedRunningTime="2026-03-20 09:17:52.963329703 +0000 UTC m=+1093.285345661" watchObservedRunningTime="2026-03-20 09:17:52.967944431 +0000 UTC m=+1093.289960389" Mar 20 09:17:52 crc kubenswrapper[4958]: I0320 09:17:52.998131 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" podStartSLOduration=4.230963958 podStartE2EDuration="21.998109102s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.244441364 +0000 UTC m=+1073.566457322" lastFinishedPulling="2026-03-20 09:17:51.011586508 +0000 UTC m=+1091.333602466" observedRunningTime="2026-03-20 09:17:52.992303923 +0000 UTC m=+1093.314319881" watchObservedRunningTime="2026-03-20 09:17:52.998109102 +0000 UTC m=+1093.320125070" Mar 20 09:17:53 crc kubenswrapper[4958]: I0320 09:17:53.010015 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" podStartSLOduration=4.407740176 podStartE2EDuration="23.009988971s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.408740707 +0000 UTC m=+1072.730756665" 
lastFinishedPulling="2026-03-20 09:17:51.010989502 +0000 UTC m=+1091.333005460" observedRunningTime="2026-03-20 09:17:53.005492897 +0000 UTC m=+1093.327508865" watchObservedRunningTime="2026-03-20 09:17:53.009988971 +0000 UTC m=+1093.332004929" Mar 20 09:17:53 crc kubenswrapper[4958]: I0320 09:17:53.029895 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" podStartSLOduration=4.553984919 podStartE2EDuration="23.029871219s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.535703758 +0000 UTC m=+1072.857719716" lastFinishedPulling="2026-03-20 09:17:51.011590058 +0000 UTC m=+1091.333606016" observedRunningTime="2026-03-20 09:17:53.023870664 +0000 UTC m=+1093.345886622" watchObservedRunningTime="2026-03-20 09:17:53.029871219 +0000 UTC m=+1093.351887177" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.974507 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp" event={"ID":"049aadcd-754d-4c89-b1cf-8ae3aa2f7748","Type":"ContainerStarted","Data":"096319aaef7a1af00cd6b69d1028deec8f4ea9f6273499ceae0f93811645f294"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.974947 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.975791 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" event={"ID":"668ba749-8ef8-42fc-bb13-7b5c6e207ed6","Type":"ContainerStarted","Data":"ac85b35f3a80d485f1a2417614b7339b34f718061c65e0cfbbb21ee2ede29446"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.976214 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.977956 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" event={"ID":"af8e40f1-7e87-4ed7-8136-1ec1ad714bac","Type":"ContainerStarted","Data":"bfe6ffb1c38718d304cb3674f2329bc85197043938b14bc28751006b8909033b"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.978279 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.980022 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" event={"ID":"60ab48da-f2e7-47d0-829e-922b0726e372","Type":"ContainerStarted","Data":"80981fc6c87647c5b32a13346479937db01f2f26912c9ead1529861c6d55866e"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.980756 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.982189 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" event={"ID":"b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc","Type":"ContainerStarted","Data":"e510c609ba6359850524e410b1f4f1bb5bffe05792217bf81867fd5ee3da23e6"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.982804 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.984204 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp" event={"ID":"1dc86ca0-19a7-44f2-90f4-40faf6f6308a","Type":"ContainerStarted","Data":"b1cba6bc2840bfd458a2d098a22cd4cefc5aecf9ff2e11d55306b29afb6f3515"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.984278 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.989139 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" event={"ID":"6db78af7-a32c-44b8-8450-d9478c3f9b1f","Type":"ContainerStarted","Data":"b65ce9fff656d5772d0e373aa7fc71c2a4352beb0c6f29f38ce5970cfeeb5e62"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.990689 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.993883 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" event={"ID":"21dbcd45-579e-42ed-a2ac-c0b9fc9482b8","Type":"ContainerStarted","Data":"3f534dc404bf2568477b46fc0f359ffe12baa58bd447577b53e8d02acc18ea8d"} Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.994275 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:17:54 crc kubenswrapper[4958]: I0320 09:17:54.995834 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp" podStartSLOduration=6.240509198 podStartE2EDuration="23.995821356s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.257030541 +0000 UTC m=+1073.579046499" lastFinishedPulling="2026-03-20 09:17:51.012342699 +0000 UTC m=+1091.334358657" observedRunningTime="2026-03-20 09:17:54.987077395 +0000 UTC m=+1095.309093353" watchObservedRunningTime="2026-03-20 09:17:54.995821356 +0000 UTC m=+1095.317837314" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.005998 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp" podStartSLOduration=6.091711794 podStartE2EDuration="24.005974305s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.098746776 +0000 UTC m=+1073.420762734" lastFinishedPulling="2026-03-20 09:17:51.013009287 +0000 UTC m=+1091.335025245" observedRunningTime="2026-03-20 09:17:55.003425325 +0000 UTC m=+1095.325441283" watchObservedRunningTime="2026-03-20 09:17:55.005974305 +0000 UTC m=+1095.327990263" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.033677 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" podStartSLOduration=4.644221128 podStartE2EDuration="25.033658269s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.33928191 +0000 UTC m=+1073.661297868" 
lastFinishedPulling="2026-03-20 09:17:53.728719051 +0000 UTC m=+1094.050735009" observedRunningTime="2026-03-20 09:17:55.032836126 +0000 UTC m=+1095.354852084" watchObservedRunningTime="2026-03-20 09:17:55.033658269 +0000 UTC m=+1095.355674227" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.056854 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" podStartSLOduration=6.761242151 podStartE2EDuration="25.056831759s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.71709052 +0000 UTC m=+1073.039106468" lastFinishedPulling="2026-03-20 09:17:51.012680098 +0000 UTC m=+1091.334696076" observedRunningTime="2026-03-20 09:17:55.050921815 +0000 UTC m=+1095.372937773" watchObservedRunningTime="2026-03-20 09:17:55.056831759 +0000 UTC m=+1095.378847717" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.079975 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" podStartSLOduration=3.764039542 podStartE2EDuration="24.079953636s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.360429583 +0000 UTC m=+1073.682445541" lastFinishedPulling="2026-03-20 09:17:53.676343677 +0000 UTC m=+1093.998359635" observedRunningTime="2026-03-20 09:17:55.072556642 +0000 UTC m=+1095.394572600" watchObservedRunningTime="2026-03-20 09:17:55.079953636 +0000 UTC m=+1095.401969594" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.095525 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" podStartSLOduration=6.748265941 podStartE2EDuration="25.095498084s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.662526175 +0000 UTC m=+1072.984542123" lastFinishedPulling="2026-03-20 09:17:51.009758308 +0000 UTC m=+1091.331774266" observedRunningTime="2026-03-20 09:17:55.085459948 +0000 UTC m=+1095.407475906" watchObservedRunningTime="2026-03-20 09:17:55.095498084 +0000 UTC m=+1095.417514042" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.109443 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" podStartSLOduration=3.7575257520000003 podStartE2EDuration="24.109424709s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.361333977 +0000 UTC m=+1073.683349935" lastFinishedPulling="2026-03-20 09:17:53.713232924 +0000 UTC m=+1094.035248892" observedRunningTime="2026-03-20 09:17:55.106222211 +0000 UTC m=+1095.428238169" watchObservedRunningTime="2026-03-20 09:17:55.109424709 +0000 UTC m=+1095.431440657" Mar 20 09:17:55 crc kubenswrapper[4958]: I0320 09:17:55.130044 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" podStartSLOduration=3.764263138 podStartE2EDuration="24.130020327s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.361631316 +0000 UTC m=+1073.683647274" lastFinishedPulling="2026-03-20 09:17:53.727388505 +0000 UTC m=+1094.049404463" observedRunningTime="2026-03-20 09:17:55.129220464 +0000 UTC m=+1095.451236422" watchObservedRunningTime="2026-03-20 09:17:55.130020327 +0000 UTC m=+1095.452036285" Mar 20 
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.153378 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566638-r6jc2"]
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.156553 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-r6jc2"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.159465 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.159866 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.159873 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.172356 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-r6jc2"]
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.286567 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vbbz\" (UniqueName: \"kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz\") pod \"auto-csr-approver-29566638-r6jc2\" (UID: \"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf\") " pod="openshift-infra/auto-csr-approver-29566638-r6jc2"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.388578 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vbbz\" (UniqueName: \"kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz\") pod \"auto-csr-approver-29566638-r6jc2\" (UID: \"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf\") " pod="openshift-infra/auto-csr-approver-29566638-r6jc2"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.413920 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vbbz\" (UniqueName: \"kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz\") pod \"auto-csr-approver-29566638-r6jc2\" (UID: \"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf\") " pod="openshift-infra/auto-csr-approver-29566638-r6jc2"
Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.480945 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-r6jc2"
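The numeric suffix in auto-csr-approver-29566638-r6jc2 follows the CronJob controller's convention of naming each Job after its scheduled time in minutes since the Unix epoch; decoded, 29566638 lands exactly on the 09:18:00 tick at which the SyncLoop ADD above fires:

    package main

    import (
        "fmt"
        "time"
    )

    // Decode the CronJob job-name suffix from auto-csr-approver-29566638-r6jc2.
    func main() {
        const suffix = 29566638 // minutes since the Unix epoch
        t := time.Unix(suffix*60, 0).UTC()
        fmt.Println(t) // 2026-03-20 09:18:00 +0000 UTC
    }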
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" Mar 20 09:18:00 crc kubenswrapper[4958]: I0320 09:18:00.968228 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-r6jc2"] Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.066130 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" event={"ID":"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf","Type":"ContainerStarted","Data":"10e54ea25bf8f2d1117716b1fbc7c8b47a4512ade56f8a459d51596b2b7559f3"} Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.072764 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" event={"ID":"88be297b-cdd1-4b8d-ae88-eb6219f0f156","Type":"ContainerStarted","Data":"f23b277adb753a3aac01f8f7b9270d7339286882f9bed354a853d1ae25c33d8e"} Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.073914 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.099295 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" podStartSLOduration=3.287304574 podStartE2EDuration="30.099267017s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.050795204 +0000 UTC m=+1073.372811162" lastFinishedPulling="2026-03-20 09:17:59.862757647 +0000 UTC m=+1100.184773605" observedRunningTime="2026-03-20 09:18:01.093418716 +0000 UTC m=+1101.415434684" watchObservedRunningTime="2026-03-20 09:18:01.099267017 +0000 UTC m=+1101.421282975" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.296080 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-tlzr6" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.302556 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-4ljl2" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.393966 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-2f897" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.487929 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-wq2w4" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.702702 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-5572j" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.740416 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-qfwqm" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.789549 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-p95zp" Mar 20 09:18:01 crc kubenswrapper[4958]: I0320 09:18:01.854621 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-bqxpp" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 
09:18:02.086652 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" event={"ID":"07df28d7-7683-4309-bee9-9aa2de96b9ce","Type":"ContainerStarted","Data":"bb2d03d767039e0d7d2103730150e25c502e54208c45534f7880cc47d711d921"} Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.087376 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.092401 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" event={"ID":"22ddf7c6-5d86-436a-b6ea-a622e854725e","Type":"ContainerStarted","Data":"62f95a9949ee8e365f12d56e7f7bed7df3ae9497eca43a699272d911c606035e"} Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.092786 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.112403 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" podStartSLOduration=3.732914934 podStartE2EDuration="32.112382947s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.537049445 +0000 UTC m=+1072.859065403" lastFinishedPulling="2026-03-20 09:18:00.916517468 +0000 UTC m=+1101.238533416" observedRunningTime="2026-03-20 09:18:02.107836182 +0000 UTC m=+1102.429852150" watchObservedRunningTime="2026-03-20 09:18:02.112382947 +0000 UTC m=+1102.434398905" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.132983 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" podStartSLOduration=3.507264341 podStartE2EDuration="32.132958134s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.319525946 +0000 UTC m=+1072.641541904" lastFinishedPulling="2026-03-20 09:18:00.945219739 +0000 UTC m=+1101.267235697" observedRunningTime="2026-03-20 09:18:02.128428799 +0000 UTC m=+1102.450444747" watchObservedRunningTime="2026-03-20 09:18:02.132958134 +0000 UTC m=+1102.454974092" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.183885 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-pfz7r" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.241044 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-d8b2d" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.252022 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-glfmx" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.258868 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-69br5" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.538627 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-tl8ls" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.949485 4958 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:18:02 crc kubenswrapper[4958]: I0320 09:18:02.959206 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/6d3c18bd-2666-4490-afbb-dbb844e5dc36-cert\") pod \"infra-operator-controller-manager-577ccd856-pms6v\" (UID: \"6d3c18bd-2666-4490-afbb-dbb844e5dc36\") " pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.053828 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-48rbz" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.062398 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.105495 4958 generic.go:334] "Generic (PLEG): container finished" podID="060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" containerID="3a183d4183ed1edad4292f0f5e3e7bfbedb6cc7ca0d4c551346315f9da4daba0" exitCode=0 Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.105553 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" event={"ID":"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf","Type":"ContainerDied","Data":"3a183d4183ed1edad4292f0f5e3e7bfbedb6cc7ca0d4c551346315f9da4daba0"} Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.502498 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-577ccd856-pms6v"] Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.663170 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.669411 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/58536825-54ec-4942-a17e-50d7db114ff9-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-fb9dm\" (UID: \"58536825-54ec-4942-a17e-50d7db114ff9\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.885254 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-kt85q" Mar 20 09:18:03 crc kubenswrapper[4958]: I0320 09:18:03.893927 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.069664 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.069787 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.085618 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-webhook-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.086202 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/90e05567-054f-41de-a1b4-4dc11ae039db-metrics-certs\") pod \"openstack-operator-controller-manager-55958644c4-qr9t7\" (UID: \"90e05567-054f-41de-a1b4-4dc11ae039db\") " pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.134177 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" event={"ID":"46972026-e8fb-46c0-bd8a-93d33a1eaccd","Type":"ContainerStarted","Data":"3de3e90e0659d753412cee035807e9f8587e553dd6959dbfb5915310230a1709"} Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.134942 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.140237 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" event={"ID":"6d3c18bd-2666-4490-afbb-dbb844e5dc36","Type":"ContainerStarted","Data":"9853951f5ba6284dd542aaed8565bd669c5cba067f5aaa58fa9ea0d5d6825cf8"} Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.199097 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" podStartSLOduration=4.030563543 podStartE2EDuration="34.199077634s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:17:32.810383243 +0000 UTC m=+1073.132399201" lastFinishedPulling="2026-03-20 09:18:02.978897344 +0000 UTC m=+1103.300913292" observedRunningTime="2026-03-20 09:18:04.192693549 +0000 UTC m=+1104.514709507" watchObservedRunningTime="2026-03-20 09:18:04.199077634 +0000 UTC m=+1104.521093592" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.367992 4958 reflector.go:368] Caches 
populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-vj7hz" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.379544 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.525114 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.586439 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm"] Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.688370 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vbbz\" (UniqueName: \"kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz\") pod \"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf\" (UID: \"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf\") " Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.695938 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz" (OuterVolumeSpecName: "kube-api-access-8vbbz") pod "060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" (UID: "060f0faa-4ff1-4f25-9354-ee90f8f7ccbf"). InnerVolumeSpecName "kube-api-access-8vbbz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.754546 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7"] Mar 20 09:18:04 crc kubenswrapper[4958]: I0320 09:18:04.790500 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vbbz\" (UniqueName: \"kubernetes.io/projected/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf-kube-api-access-8vbbz\") on node \"crc\" DevicePath \"\""
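
Note on the recurring util.go:30 / util.go:48 messages: they mark kubelet's pod-sandbox decision when it (re)starts a pod. If no sandbox exists for the pod, a new one is created; if a sandbox exists but is no longer ready (typical for a completed Job pod being re-synced), it is also replaced. An illustrative sketch of that decision, not the actual kubelet source:

```go
// Illustrative only: the two messages correspond to a decision like this
// when kubelet computes what actions a pod needs.
package main

import "fmt"

type Sandbox struct {
	ID    string
	Ready bool
}

// needsNewSandbox mirrors the two log lines: create a sandbox when none
// exists, and recreate it when the existing one is no longer ready.
func needsNewSandbox(s *Sandbox) (bool, string) {
	if s == nil {
		return true, "No sandbox for pod can be found. Need to start a new one"
	}
	if !s.Ready {
		return true, "No ready sandbox for pod can be found. Need to start a new one"
	}
	return false, ""
}

func main() {
	for _, s := range []*Sandbox{nil, {ID: "10e54ea2", Ready: false}} {
		if create, why := needsNewSandbox(s); create {
			fmt.Println(why)
		}
	}
}
```

Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.148779 4958 util.go:48] "No ready sandbox for pod can be found.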
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.149214 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566638-r6jc2" event={"ID":"060f0faa-4ff1-4f25-9354-ee90f8f7ccbf","Type":"ContainerDied","Data":"10e54ea25bf8f2d1117716b1fbc7c8b47a4512ade56f8a459d51596b2b7559f3"} Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.151919 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="10e54ea25bf8f2d1117716b1fbc7c8b47a4512ade56f8a459d51596b2b7559f3" Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.156205 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" event={"ID":"58536825-54ec-4942-a17e-50d7db114ff9","Type":"ContainerStarted","Data":"bc5e082d6082aa181e03d9b63957e33a3f521f030bc1250e28441529d448b9ad"} Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.158397 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" event={"ID":"90e05567-054f-41de-a1b4-4dc11ae039db","Type":"ContainerStarted","Data":"cb5cd3a938029ca386cd06effdec88b81f9cf08d714ea4115f5d035a48e62a2a"} Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.592517 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-nvgj8"] Mar 20 09:18:05 crc kubenswrapper[4958]: I0320 09:18:05.596706 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566632-nvgj8"] Mar 20 09:18:06 crc kubenswrapper[4958]: I0320 09:18:06.445772 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65abaa7b-f291-4255-b84c-29352c3e6ea0" path="/var/lib/kubelet/pods/65abaa7b-f291-4255-b84c-29352c3e6ea0/volumes" Mar 20 09:18:11 crc kubenswrapper[4958]: I0320 09:18:11.160616 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-b8zbp" Mar 20 09:18:11 crc kubenswrapper[4958]: I0320 09:18:11.353828 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-j4w4r" Mar 20 09:18:11 crc kubenswrapper[4958]: I0320 09:18:11.521804 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-fvr27" Mar 20 09:18:11 crc kubenswrapper[4958]: I0320 09:18:11.961494 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-llgf2" Mar 20 09:18:14 crc kubenswrapper[4958]: I0320 09:18:14.220168 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" event={"ID":"90e05567-054f-41de-a1b4-4dc11ae039db","Type":"ContainerStarted","Data":"6b9bf105cf91a8a5900f19bfb0a32fb0d3d79b2c050433421e6062ee8dad7cb2"} Mar 20 09:18:14 crc kubenswrapper[4958]: I0320 09:18:14.220334 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:14 crc kubenswrapper[4958]: I0320 09:18:14.250730 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" podStartSLOduration=43.25071177 podStartE2EDuration="43.25071177s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:18:14.245689522 +0000 UTC m=+1114.567705480" watchObservedRunningTime="2026-03-20 09:18:14.25071177 +0000 UTC m=+1114.572727728" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.257021 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" event={"ID":"9d54ed62-2236-4fdc-9fdb-f2042817795e","Type":"ContainerStarted","Data":"f6d913573d894892e1cccd9b4794cd2734f374412964e93a22deba272d9fffcb"} Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.257776 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.258526 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" event={"ID":"70f92bb8-0cc8-4804-a8d9-d5d3441e953e","Type":"ContainerStarted","Data":"fb0c2e09863b8be1abea98cebbf167b6ae4cd21542caa77bd8b77a76c99daa96"} Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.258801 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.259936 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" event={"ID":"58536825-54ec-4942-a17e-50d7db114ff9","Type":"ContainerStarted","Data":"db0be77e3878558f0399059b82f056c423a4a80aebc99ec826104a3c3db7fdab"} Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.260049 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.261153 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" event={"ID":"6d3c18bd-2666-4490-afbb-dbb844e5dc36","Type":"ContainerStarted","Data":"4be8585f698e059490d6a9d1fc30293a3a4d394c5c9f5170df5b2abffd927380"} Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.261317 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.295348 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" podStartSLOduration=3.004819243 podStartE2EDuration="47.295328213s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.338998221 +0000 UTC m=+1073.661014179" lastFinishedPulling="2026-03-20 09:18:17.629507191 +0000 UTC m=+1117.951523149" observedRunningTime="2026-03-20 09:18:18.280286128 +0000 UTC m=+1118.602302096" watchObservedRunningTime="2026-03-20 09:18:18.295328213 +0000 UTC m=+1118.617344171" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.300483 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" podStartSLOduration=34.101462322 podStartE2EDuration="48.300466324s" podCreationTimestamp="2026-03-20 09:17:30 +0000 UTC" firstStartedPulling="2026-03-20 09:18:03.514792803 +0000 UTC m=+1103.836808761" lastFinishedPulling="2026-03-20 09:18:17.713796795 +0000 UTC m=+1118.035812763" observedRunningTime="2026-03-20 09:18:18.293791161 +0000 UTC m=+1118.615807119" watchObservedRunningTime="2026-03-20 09:18:18.300466324 +0000 UTC m=+1118.622482282" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.320894 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" podStartSLOduration=34.249958477 podStartE2EDuration="47.320871968s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:18:04.603499718 +0000 UTC m=+1104.925515676" lastFinishedPulling="2026-03-20 09:18:17.674413219 +0000 UTC m=+1117.996429167" observedRunningTime="2026-03-20 09:18:18.320076246 +0000 UTC m=+1118.642092204" watchObservedRunningTime="2026-03-20 09:18:18.320871968 +0000 UTC m=+1118.642887926" Mar 20 09:18:18 crc kubenswrapper[4958]: I0320 09:18:18.351081 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" podStartSLOduration=2.962672501 podStartE2EDuration="47.35105465s" podCreationTimestamp="2026-03-20 09:17:31 +0000 UTC" firstStartedPulling="2026-03-20 09:17:33.243489667 +0000 UTC m=+1073.565505625" lastFinishedPulling="2026-03-20 09:18:17.631871816 +0000 UTC m=+1117.953887774" observedRunningTime="2026-03-20 09:18:18.346045162 +0000 UTC m=+1118.668061120" watchObservedRunningTime="2026-03-20 09:18:18.35105465 +0000 UTC m=+1118.673070608" Mar 20 09:18:22 crc kubenswrapper[4958]: I0320 09:18:22.039376 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-pg9qm" Mar 20 09:18:22 crc kubenswrapper[4958]: I0320 09:18:22.346026 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-ch6hb" Mar 20 09:18:23 crc kubenswrapper[4958]: I0320 09:18:23.070772 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-577ccd856-pms6v" Mar 20 09:18:23 crc kubenswrapper[4958]: I0320 09:18:23.902407 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-fb9dm" Mar 20 09:18:24 crc kubenswrapper[4958]: I0320 09:18:24.388715 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-55958644c4-qr9t7" Mar 20 09:18:26 crc kubenswrapper[4958]: I0320 09:18:26.520941 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:18:26 crc kubenswrapper[4958]: I0320 09:18:26.521481 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.803884 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:18:40 crc kubenswrapper[4958]: E0320 09:18:40.804900 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" containerName="oc" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.804919 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" containerName="oc" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.805089 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" containerName="oc" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.805886 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.809029 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.809445 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-pvnv4" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.809587 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.810660 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.816400 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.888029 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bshzx"] Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.889135 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.891693 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.914447 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bshzx"] Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.930173 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7jdx\" (UniqueName: \"kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:40 crc kubenswrapper[4958]: I0320 09:18:40.930247 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.031319 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-config\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.031405 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9jwr\" (UniqueName: \"kubernetes.io/projected/f3958399-d780-4806-ae2c-2a2479b6d911-kube-api-access-w9jwr\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.031443 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7jdx\" (UniqueName: \"kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.031470 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.031505 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.032781 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 
09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.064890 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7jdx\" (UniqueName: \"kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx\") pod \"dnsmasq-dns-675f4bcbfc-p4lw8\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.129013 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.132550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-config\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.132619 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9jwr\" (UniqueName: \"kubernetes.io/projected/f3958399-d780-4806-ae2c-2a2479b6d911-kube-api-access-w9jwr\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.132673 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.133824 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.134377 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f3958399-d780-4806-ae2c-2a2479b6d911-config\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.160434 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9jwr\" (UniqueName: \"kubernetes.io/projected/f3958399-d780-4806-ae2c-2a2479b6d911-kube-api-access-w9jwr\") pod \"dnsmasq-dns-78dd6ddcc-bshzx\" (UID: \"f3958399-d780-4806-ae2c-2a2479b6d911\") " pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.208301 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.685839 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.695128 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:18:41 crc kubenswrapper[4958]: I0320 09:18:41.773687 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-bshzx"] Mar 20 09:18:41 crc kubenswrapper[4958]: W0320 09:18:41.780341 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf3958399_d780_4806_ae2c_2a2479b6d911.slice/crio-9b64cdcae7d855b0935b9bc8a6bf978cff4ffa9f3d02172fbfccfab3bed46c1b WatchSource:0}: Error finding container 9b64cdcae7d855b0935b9bc8a6bf978cff4ffa9f3d02172fbfccfab3bed46c1b: Status 404 returned error can't find the container with id 9b64cdcae7d855b0935b9bc8a6bf978cff4ffa9f3d02172fbfccfab3bed46c1b Mar 20 09:18:42 crc kubenswrapper[4958]: I0320 09:18:42.446323 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" event={"ID":"9732fa94-74d3-4a60-807a-e4d2eb4c64e0","Type":"ContainerStarted","Data":"a29fae701cbe8ea41309e6e5ce7f10584d67304467700bc5ca9047636b0bd96f"} Mar 20 09:18:42 crc kubenswrapper[4958]: I0320 09:18:42.446369 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" event={"ID":"f3958399-d780-4806-ae2c-2a2479b6d911","Type":"ContainerStarted","Data":"9b64cdcae7d855b0935b9bc8a6bf978cff4ffa9f3d02172fbfccfab3bed46c1b"} Mar 20 09:18:43 crc kubenswrapper[4958]: I0320 09:18:43.207754 4958 scope.go:117] "RemoveContainer" containerID="345d2342735db1e9c95407176c092de85b7fbb08e026fc7f81f9165c146d8d53" Mar 20 09:18:56 crc kubenswrapper[4958]: I0320 09:18:56.521245 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:18:56 crc kubenswrapper[4958]: I0320 09:18:56.522638 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.223629 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.224171 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w9jwr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-bshzx_openstack(f3958399-d780-4806-ae2c-2a2479b6d911): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.225471 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" podUID="f3958399-d780-4806-ae2c-2a2479b6d911" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.239144 4958 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.239306 4958 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w7jdx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-p4lw8_openstack(9732fa94-74d3-4a60-807a-e4d2eb4c64e0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.240515 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.611409 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" Mar 20 09:19:01 crc kubenswrapper[4958]: E0320 09:19:01.611442 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" podUID="f3958399-d780-4806-ae2c-2a2479b6d911" Mar 20 09:19:15 crc kubenswrapper[4958]: I0320 09:19:15.723336 4958 generic.go:334] "Generic (PLEG): container finished" podID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerID="77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673" exitCode=0 Mar 20 09:19:15 crc kubenswrapper[4958]: I0320 09:19:15.723428 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" event={"ID":"9732fa94-74d3-4a60-807a-e4d2eb4c64e0","Type":"ContainerDied","Data":"77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673"} Mar 20 09:19:16 crc kubenswrapper[4958]: I0320 09:19:16.734941 4958 generic.go:334] "Generic (PLEG): container 
finished" podID="f3958399-d780-4806-ae2c-2a2479b6d911" containerID="dec3601b09932947b12016401d6779c3a71fb5e7b114a61c2ea8c8caab3b6468" exitCode=0 Mar 20 09:19:16 crc kubenswrapper[4958]: I0320 09:19:16.735066 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" event={"ID":"f3958399-d780-4806-ae2c-2a2479b6d911","Type":"ContainerDied","Data":"dec3601b09932947b12016401d6779c3a71fb5e7b114a61c2ea8c8caab3b6468"} Mar 20 09:19:16 crc kubenswrapper[4958]: I0320 09:19:16.741748 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" event={"ID":"9732fa94-74d3-4a60-807a-e4d2eb4c64e0","Type":"ContainerStarted","Data":"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81"} Mar 20 09:19:16 crc kubenswrapper[4958]: I0320 09:19:16.741970 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:19:16 crc kubenswrapper[4958]: I0320 09:19:16.777099 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" podStartSLOduration=3.531923794 podStartE2EDuration="36.777070045s" podCreationTimestamp="2026-03-20 09:18:40 +0000 UTC" firstStartedPulling="2026-03-20 09:18:41.694883652 +0000 UTC m=+1142.016899610" lastFinishedPulling="2026-03-20 09:19:14.940029903 +0000 UTC m=+1175.262045861" observedRunningTime="2026-03-20 09:19:16.771075159 +0000 UTC m=+1177.093091127" watchObservedRunningTime="2026-03-20 09:19:16.777070045 +0000 UTC m=+1177.099086013" Mar 20 09:19:17 crc kubenswrapper[4958]: I0320 09:19:17.752344 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" event={"ID":"f3958399-d780-4806-ae2c-2a2479b6d911","Type":"ContainerStarted","Data":"91c54cc4ef14f4fff18460b708be131bce64633fe813552c51084d3545f739c0"} Mar 20 09:19:17 crc kubenswrapper[4958]: I0320 09:19:17.753351 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:19:17 crc kubenswrapper[4958]: I0320 09:19:17.785173 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" podStartSLOduration=-9223371999.069633 podStartE2EDuration="37.785141787s" podCreationTimestamp="2026-03-20 09:18:40 +0000 UTC" firstStartedPulling="2026-03-20 09:18:41.782683453 +0000 UTC m=+1142.104699411" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 09:19:17.77947158 +0000 UTC m=+1178.101487538" watchObservedRunningTime="2026-03-20 09:19:17.785141787 +0000 UTC m=+1178.107157745" Mar 20 09:19:21 crc kubenswrapper[4958]: I0320 09:19:21.130943 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:19:21 crc kubenswrapper[4958]: I0320 09:19:21.210766 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd6ddcc-bshzx" Mar 20 09:19:21 crc kubenswrapper[4958]: I0320 09:19:21.271678 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:19:21 crc kubenswrapper[4958]: I0320 09:19:21.790001 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="dnsmasq-dns" containerID="cri-o://e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81" gracePeriod=10 
Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.291139 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.424569 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7jdx\" (UniqueName: \"kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx\") pod \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.424660 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config\") pod \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\" (UID: \"9732fa94-74d3-4a60-807a-e4d2eb4c64e0\") " Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.432590 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx" (OuterVolumeSpecName: "kube-api-access-w7jdx") pod "9732fa94-74d3-4a60-807a-e4d2eb4c64e0" (UID: "9732fa94-74d3-4a60-807a-e4d2eb4c64e0"). InnerVolumeSpecName "kube-api-access-w7jdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.475659 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config" (OuterVolumeSpecName: "config") pod "9732fa94-74d3-4a60-807a-e4d2eb4c64e0" (UID: "9732fa94-74d3-4a60-807a-e4d2eb4c64e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.526958 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7jdx\" (UniqueName: \"kubernetes.io/projected/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-kube-api-access-w7jdx\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.526999 4958 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9732fa94-74d3-4a60-807a-e4d2eb4c64e0-config\") on node \"crc\" DevicePath \"\"" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.802203 4958 generic.go:334] "Generic (PLEG): container finished" podID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerID="e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81" exitCode=0 Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.802381 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" event={"ID":"9732fa94-74d3-4a60-807a-e4d2eb4c64e0","Type":"ContainerDied","Data":"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81"} Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.802629 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" event={"ID":"9732fa94-74d3-4a60-807a-e4d2eb4c64e0","Type":"ContainerDied","Data":"a29fae701cbe8ea41309e6e5ce7f10584d67304467700bc5ca9047636b0bd96f"} Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.802649 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-p4lw8" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.802746 4958 scope.go:117] "RemoveContainer" containerID="e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.840763 4958 scope.go:117] "RemoveContainer" containerID="77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.846775 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.855122 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-p4lw8"] Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.865089 4958 scope.go:117] "RemoveContainer" containerID="e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81" Mar 20 09:19:22 crc kubenswrapper[4958]: E0320 09:19:22.865898 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81\": container with ID starting with e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81 not found: ID does not exist" containerID="e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.866048 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81"} err="failed to get container status \"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81\": rpc error: code = NotFound desc = could not find container \"e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81\": container with ID starting with e313e464c2ce252dbc38711527bea2b710f05d8c0200e17c2e0873ff81961e81 not found: ID does not exist" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.866238 4958 scope.go:117] "RemoveContainer" containerID="77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673" Mar 20 09:19:22 crc kubenswrapper[4958]: E0320 09:19:22.866901 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673\": container with ID starting with 77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673 not found: ID does not exist" containerID="77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673" Mar 20 09:19:22 crc kubenswrapper[4958]: I0320 09:19:22.866966 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673"} err="failed to get container status \"77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673\": rpc error: code = NotFound desc = could not find container \"77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673\": container with ID starting with 77b0e776859ff61405ccb30fea44f25e3be0699aab9b3dec0d5273ce197be673 not found: ID does not exist" Mar 20 09:19:24 crc kubenswrapper[4958]: I0320 09:19:24.446706 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" path="/var/lib/kubelet/pods/9732fa94-74d3-4a60-807a-e4d2eb4c64e0/volumes" Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.521760 4958 
patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.522328 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.522383 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.523128 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.523184 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088" gracePeriod=600
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.837016 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088" exitCode=0
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.837063 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088"}
Mar 20 09:19:26 crc kubenswrapper[4958]: I0320 09:19:26.837098 4958 scope.go:117] "RemoveContainer" containerID="d50121cef1dafbc948002311d0250ee4e915179ff897da522e2cdd9606be5fc6"
Mar 20 09:19:27 crc kubenswrapper[4958]: I0320 09:19:27.860955 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744"}
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.153432 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566640-x9skg"]
Mar 20 09:20:00 crc kubenswrapper[4958]: E0320 09:20:00.154942 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="init"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.154987 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="init"
Mar 20 09:20:00 crc kubenswrapper[4958]: E0320 09:20:00.155024 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="dnsmasq-dns"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.155037 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="dnsmasq-dns"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.155307 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9732fa94-74d3-4a60-807a-e4d2eb4c64e0" containerName="dnsmasq-dns"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.156269 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.159254 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.159258 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.159663 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.170340 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-x9skg"]
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.292738 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4g6f\" (UniqueName: \"kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f\") pod \"auto-csr-approver-29566640-x9skg\" (UID: \"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79\") " pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.394116 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4g6f\" (UniqueName: \"kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f\") pod \"auto-csr-approver-29566640-x9skg\" (UID: \"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79\") " pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.422755 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4g6f\" (UniqueName: \"kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f\") pod \"auto-csr-approver-29566640-x9skg\" (UID: \"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79\") " pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.476202 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:00 crc kubenswrapper[4958]: I0320 09:20:00.977628 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-x9skg"]
Mar 20 09:20:01 crc kubenswrapper[4958]: I0320 09:20:01.167941 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-x9skg" event={"ID":"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79","Type":"ContainerStarted","Data":"abdb7602b07cb702b67376a36accc3f5eb543c78d2897ea4b3d237b46b52317b"}
Mar 20 09:20:03 crc kubenswrapper[4958]: I0320 09:20:03.187580 4958 generic.go:334] "Generic (PLEG): container finished" podID="a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" containerID="3d56647e17b75d6ca56242719838964b4d7336c7b3019bacf60c8a186696be0b" exitCode=0
Mar 20 09:20:03 crc kubenswrapper[4958]: I0320 09:20:03.187764 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-x9skg" event={"ID":"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79","Type":"ContainerDied","Data":"3d56647e17b75d6ca56242719838964b4d7336c7b3019bacf60c8a186696be0b"}
Mar 20 09:20:04 crc kubenswrapper[4958]: I0320 09:20:04.477187 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:04 crc kubenswrapper[4958]: I0320 09:20:04.562545 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b4g6f\" (UniqueName: \"kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f\") pod \"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79\" (UID: \"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79\") "
Mar 20 09:20:04 crc kubenswrapper[4958]: I0320 09:20:04.569633 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f" (OuterVolumeSpecName: "kube-api-access-b4g6f") pod "a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" (UID: "a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79"). InnerVolumeSpecName "kube-api-access-b4g6f". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:20:04 crc kubenswrapper[4958]: I0320 09:20:04.664772 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b4g6f\" (UniqueName: \"kubernetes.io/projected/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79-kube-api-access-b4g6f\") on node \"crc\" DevicePath \"\""
Mar 20 09:20:05 crc kubenswrapper[4958]: I0320 09:20:05.206433 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566640-x9skg" event={"ID":"a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79","Type":"ContainerDied","Data":"abdb7602b07cb702b67376a36accc3f5eb543c78d2897ea4b3d237b46b52317b"}
Mar 20 09:20:05 crc kubenswrapper[4958]: I0320 09:20:05.206820 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="abdb7602b07cb702b67376a36accc3f5eb543c78d2897ea4b3d237b46b52317b"
Mar 20 09:20:05 crc kubenswrapper[4958]: I0320 09:20:05.206516 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566640-x9skg"
Mar 20 09:20:05 crc kubenswrapper[4958]: I0320 09:20:05.564007 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-k4tm9"]
Mar 20 09:20:05 crc kubenswrapper[4958]: I0320 09:20:05.569729 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566634-k4tm9"]
Mar 20 09:20:06 crc kubenswrapper[4958]: I0320 09:20:06.443533 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3481c9df-80a0-42c9-a2c3-ba845e0f14c0" path="/var/lib/kubelet/pods/3481c9df-80a0-42c9-a2c3-ba845e0f14c0/volumes"
Mar 20 09:20:43 crc kubenswrapper[4958]: I0320 09:20:43.383378 4958 scope.go:117] "RemoveContainer" containerID="4cce592f1c1354f99af4d2e887753ac54bcaf92082b1fb9167af7935ed89bdbb"
Mar 20 09:21:56 crc kubenswrapper[4958]: I0320 09:21:56.521687 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:21:56 crc kubenswrapper[4958]: I0320 09:21:56.522542 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.151334 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566642-cz2m9"]
Mar 20 09:22:00 crc kubenswrapper[4958]: E0320 09:22:00.152576 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" containerName="oc"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.152622 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" containerName="oc"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.152889 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" containerName="oc"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.153721 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.156172 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.156446 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.156648 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.159805 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-cz2m9"]
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.234107 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mf628\" (UniqueName: \"kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628\") pod \"auto-csr-approver-29566642-cz2m9\" (UID: \"d221edd0-8dd5-41f9-b864-43f1f91b3f77\") " pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.335684 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mf628\" (UniqueName: \"kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628\") pod \"auto-csr-approver-29566642-cz2m9\" (UID: \"d221edd0-8dd5-41f9-b864-43f1f91b3f77\") " pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.359376 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mf628\" (UniqueName: \"kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628\") pod \"auto-csr-approver-29566642-cz2m9\" (UID: \"d221edd0-8dd5-41f9-b864-43f1f91b3f77\") " pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.484387 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:00 crc kubenswrapper[4958]: I0320 09:22:00.904850 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-cz2m9"]
Mar 20 09:22:01 crc kubenswrapper[4958]: I0320 09:22:01.376319 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566642-cz2m9" event={"ID":"d221edd0-8dd5-41f9-b864-43f1f91b3f77","Type":"ContainerStarted","Data":"b3bab40364aa65d61492226c46b7f7a9b1ee2ab5a1c310a7b2ea99e43e55596c"}
Mar 20 09:22:02 crc kubenswrapper[4958]: I0320 09:22:02.386138 4958 generic.go:334] "Generic (PLEG): container finished" podID="d221edd0-8dd5-41f9-b864-43f1f91b3f77" containerID="aa806e5f2cc2ac5ed0acb858464202e5ff0c3a417b9748361786bc84d4cf2ec5" exitCode=0
Mar 20 09:22:02 crc kubenswrapper[4958]: I0320 09:22:02.386226 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566642-cz2m9" event={"ID":"d221edd0-8dd5-41f9-b864-43f1f91b3f77","Type":"ContainerDied","Data":"aa806e5f2cc2ac5ed0acb858464202e5ff0c3a417b9748361786bc84d4cf2ec5"}
Mar 20 09:22:03 crc kubenswrapper[4958]: I0320 09:22:03.690707 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:03 crc kubenswrapper[4958]: I0320 09:22:03.800213 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mf628\" (UniqueName: \"kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628\") pod \"d221edd0-8dd5-41f9-b864-43f1f91b3f77\" (UID: \"d221edd0-8dd5-41f9-b864-43f1f91b3f77\") "
Mar 20 09:22:03 crc kubenswrapper[4958]: I0320 09:22:03.807945 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628" (OuterVolumeSpecName: "kube-api-access-mf628") pod "d221edd0-8dd5-41f9-b864-43f1f91b3f77" (UID: "d221edd0-8dd5-41f9-b864-43f1f91b3f77"). InnerVolumeSpecName "kube-api-access-mf628". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:22:03 crc kubenswrapper[4958]: I0320 09:22:03.902208 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mf628\" (UniqueName: \"kubernetes.io/projected/d221edd0-8dd5-41f9-b864-43f1f91b3f77-kube-api-access-mf628\") on node \"crc\" DevicePath \"\""
Mar 20 09:22:04 crc kubenswrapper[4958]: I0320 09:22:04.417035 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566642-cz2m9" event={"ID":"d221edd0-8dd5-41f9-b864-43f1f91b3f77","Type":"ContainerDied","Data":"b3bab40364aa65d61492226c46b7f7a9b1ee2ab5a1c310a7b2ea99e43e55596c"}
Mar 20 09:22:04 crc kubenswrapper[4958]: I0320 09:22:04.417103 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3bab40364aa65d61492226c46b7f7a9b1ee2ab5a1c310a7b2ea99e43e55596c"
Mar 20 09:22:04 crc kubenswrapper[4958]: I0320 09:22:04.417149 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566642-cz2m9"
Mar 20 09:22:04 crc kubenswrapper[4958]: I0320 09:22:04.762901 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-54tzd"]
Mar 20 09:22:04 crc kubenswrapper[4958]: I0320 09:22:04.767826 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566636-54tzd"]
Mar 20 09:22:06 crc kubenswrapper[4958]: I0320 09:22:06.444333 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="add4ecff-63cc-486a-90ed-3e61f3c143ba" path="/var/lib/kubelet/pods/add4ecff-63cc-486a-90ed-3e61f3c143ba/volumes"
Mar 20 09:22:26 crc kubenswrapper[4958]: I0320 09:22:26.521490 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:22:26 crc kubenswrapper[4958]: I0320 09:22:26.523295 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:22:43 crc kubenswrapper[4958]: I0320 09:22:43.473267 4958 scope.go:117] "RemoveContainer" containerID="c263df7d94f23aa7486f8436bc1f644a5d1243f92e23f1d903429f26741ef1d6"
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.521867 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.523329 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.523402 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.524910 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.525883 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744" gracePeriod=600
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.860853 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744" exitCode=0
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.860973 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744"}
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.862182 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"}
Mar 20 09:22:56 crc kubenswrapper[4958]: I0320 09:22:56.862213 4958 scope.go:117] "RemoveContainer" containerID="007b6668849ff989fcaab0fedbd591707a471a4800519c18e47480ba1f688088"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.144793 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566644-gqcl5"]
Mar 20 09:24:00 crc kubenswrapper[4958]: E0320 09:24:00.146245 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d221edd0-8dd5-41f9-b864-43f1f91b3f77" containerName="oc"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.146270 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d221edd0-8dd5-41f9-b864-43f1f91b3f77" containerName="oc"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.146476 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d221edd0-8dd5-41f9-b864-43f1f91b3f77" containerName="oc"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.147227 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.154691 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-gqcl5"]
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.157772 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.158131 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.158738 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.215202 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-th77n\" (UniqueName: \"kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n\") pod \"auto-csr-approver-29566644-gqcl5\" (UID: \"086880e3-27d5-49eb-ad88-5efb0da29e01\") " pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.316481 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-th77n\" (UniqueName: \"kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n\") pod \"auto-csr-approver-29566644-gqcl5\" (UID: \"086880e3-27d5-49eb-ad88-5efb0da29e01\") " pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.340096 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-th77n\" (UniqueName: \"kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n\") pod \"auto-csr-approver-29566644-gqcl5\" (UID: \"086880e3-27d5-49eb-ad88-5efb0da29e01\") " pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.472077 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.938951 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-gqcl5"]
Mar 20 09:24:00 crc kubenswrapper[4958]: I0320 09:24:00.946299 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 09:24:01 crc kubenswrapper[4958]: I0320 09:24:01.405094 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-gqcl5" event={"ID":"086880e3-27d5-49eb-ad88-5efb0da29e01","Type":"ContainerStarted","Data":"78b4b18ca64e695ab268230aaea0051665b8c6f3b4f2e2a2e0635aa70ca95a4e"}
Mar 20 09:24:02 crc kubenswrapper[4958]: I0320 09:24:02.414180 4958 generic.go:334] "Generic (PLEG): container finished" podID="086880e3-27d5-49eb-ad88-5efb0da29e01" containerID="2a21c8f81a038404043c600313d630aa26acaba751a3a777a393455846d28ced" exitCode=0
Mar 20 09:24:02 crc kubenswrapper[4958]: I0320 09:24:02.414255 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-gqcl5" event={"ID":"086880e3-27d5-49eb-ad88-5efb0da29e01","Type":"ContainerDied","Data":"2a21c8f81a038404043c600313d630aa26acaba751a3a777a393455846d28ced"}
Mar 20 09:24:03 crc kubenswrapper[4958]: I0320 09:24:03.730074 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:03 crc kubenswrapper[4958]: I0320 09:24:03.776801 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-th77n\" (UniqueName: \"kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n\") pod \"086880e3-27d5-49eb-ad88-5efb0da29e01\" (UID: \"086880e3-27d5-49eb-ad88-5efb0da29e01\") "
Mar 20 09:24:03 crc kubenswrapper[4958]: I0320 09:24:03.783699 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n" (OuterVolumeSpecName: "kube-api-access-th77n") pod "086880e3-27d5-49eb-ad88-5efb0da29e01" (UID: "086880e3-27d5-49eb-ad88-5efb0da29e01"). InnerVolumeSpecName "kube-api-access-th77n". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:24:03 crc kubenswrapper[4958]: I0320 09:24:03.879208 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-th77n\" (UniqueName: \"kubernetes.io/projected/086880e3-27d5-49eb-ad88-5efb0da29e01-kube-api-access-th77n\") on node \"crc\" DevicePath \"\""
Mar 20 09:24:04 crc kubenswrapper[4958]: I0320 09:24:04.433129 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566644-gqcl5" event={"ID":"086880e3-27d5-49eb-ad88-5efb0da29e01","Type":"ContainerDied","Data":"78b4b18ca64e695ab268230aaea0051665b8c6f3b4f2e2a2e0635aa70ca95a4e"}
Mar 20 09:24:04 crc kubenswrapper[4958]: I0320 09:24:04.433183 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="78b4b18ca64e695ab268230aaea0051665b8c6f3b4f2e2a2e0635aa70ca95a4e"
Mar 20 09:24:04 crc kubenswrapper[4958]: I0320 09:24:04.433248 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566644-gqcl5"
Mar 20 09:24:04 crc kubenswrapper[4958]: I0320 09:24:04.801949 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-r6jc2"]
Mar 20 09:24:04 crc kubenswrapper[4958]: I0320 09:24:04.808248 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566638-r6jc2"]
Mar 20 09:24:06 crc kubenswrapper[4958]: I0320 09:24:06.451133 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="060f0faa-4ff1-4f25-9354-ee90f8f7ccbf" path="/var/lib/kubelet/pods/060f0faa-4ff1-4f25-9354-ee90f8f7ccbf/volumes"
Mar 20 09:24:43 crc kubenswrapper[4958]: I0320 09:24:43.581671 4958 scope.go:117] "RemoveContainer" containerID="3a183d4183ed1edad4292f0f5e3e7bfbedb6cc7ca0d4c551346315f9da4daba0"
Mar 20 09:24:56 crc kubenswrapper[4958]: I0320 09:24:56.521398 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:24:56 crc kubenswrapper[4958]: I0320 09:24:56.522281 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:25:26 crc kubenswrapper[4958]: I0320 09:25:26.521330 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:25:26 crc kubenswrapper[4958]: I0320 09:25:26.522047 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:25:56 crc kubenswrapper[4958]: I0320 09:25:56.521687 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:25:56 crc kubenswrapper[4958]: I0320 09:25:56.522721 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:25:56 crc kubenswrapper[4958]: I0320 09:25:56.522797 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:25:56 crc kubenswrapper[4958]: I0320 09:25:56.523790 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:25:56 crc kubenswrapper[4958]: I0320 09:25:56.523867 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" gracePeriod=600
Mar 20 09:25:56 crc kubenswrapper[4958]: E0320 09:25:56.669187 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:25:57 crc kubenswrapper[4958]: I0320 09:25:57.381370 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" exitCode=0
Mar 20 09:25:57 crc kubenswrapper[4958]: I0320 09:25:57.381426 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"}
Mar 20 09:25:57 crc kubenswrapper[4958]: I0320 09:25:57.381469 4958 scope.go:117] "RemoveContainer" containerID="6968d552cba3a45e5d78b3f461ade07ff40ad061c61889b404b41479bd961744"
Mar 20 09:25:57 crc kubenswrapper[4958]: I0320 09:25:57.382188 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:25:57 crc kubenswrapper[4958]: E0320 09:25:57.382417 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.144540 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566646-vzdc5"]
Mar 20 09:26:00 crc kubenswrapper[4958]: E0320 09:26:00.147351 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="086880e3-27d5-49eb-ad88-5efb0da29e01" containerName="oc"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.147384 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="086880e3-27d5-49eb-ad88-5efb0da29e01" containerName="oc"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.147699 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="086880e3-27d5-49eb-ad88-5efb0da29e01" containerName="oc"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.148511 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.151388 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.151784 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.152003 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.163576 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-vzdc5"]
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.209892 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c68vw\" (UniqueName: \"kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw\") pod \"auto-csr-approver-29566646-vzdc5\" (UID: \"20769995-eaa7-4d5c-9a17-9252d478f74f\") " pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.310892 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c68vw\" (UniqueName: \"kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw\") pod \"auto-csr-approver-29566646-vzdc5\" (UID: \"20769995-eaa7-4d5c-9a17-9252d478f74f\") " pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.339832 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c68vw\" (UniqueName: \"kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw\") pod \"auto-csr-approver-29566646-vzdc5\" (UID: \"20769995-eaa7-4d5c-9a17-9252d478f74f\") " pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.468996 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:00 crc kubenswrapper[4958]: I0320 09:26:00.931195 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-vzdc5"]
Mar 20 09:26:01 crc kubenswrapper[4958]: I0320 09:26:01.420371 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-vzdc5" event={"ID":"20769995-eaa7-4d5c-9a17-9252d478f74f","Type":"ContainerStarted","Data":"ca00017316ed7843d11cbafafa56b31797350edb5679ba6ccf50bee8d2ec9b13"}
Mar 20 09:26:02 crc kubenswrapper[4958]: I0320 09:26:02.436850 4958 generic.go:334] "Generic (PLEG): container finished" podID="20769995-eaa7-4d5c-9a17-9252d478f74f" containerID="9eefae57cc37e912b2679aa07a2fdd91cd61c9953cc29284029e0c687ff915f3" exitCode=0
Mar 20 09:26:02 crc kubenswrapper[4958]: I0320 09:26:02.445338 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-vzdc5" event={"ID":"20769995-eaa7-4d5c-9a17-9252d478f74f","Type":"ContainerDied","Data":"9eefae57cc37e912b2679aa07a2fdd91cd61c9953cc29284029e0c687ff915f3"}
Mar 20 09:26:03 crc kubenswrapper[4958]: I0320 09:26:03.724154 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:03 crc kubenswrapper[4958]: I0320 09:26:03.776465 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c68vw\" (UniqueName: \"kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw\") pod \"20769995-eaa7-4d5c-9a17-9252d478f74f\" (UID: \"20769995-eaa7-4d5c-9a17-9252d478f74f\") "
Mar 20 09:26:03 crc kubenswrapper[4958]: I0320 09:26:03.785003 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw" (OuterVolumeSpecName: "kube-api-access-c68vw") pod "20769995-eaa7-4d5c-9a17-9252d478f74f" (UID: "20769995-eaa7-4d5c-9a17-9252d478f74f"). InnerVolumeSpecName "kube-api-access-c68vw". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:26:03 crc kubenswrapper[4958]: I0320 09:26:03.878432 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c68vw\" (UniqueName: \"kubernetes.io/projected/20769995-eaa7-4d5c-9a17-9252d478f74f-kube-api-access-c68vw\") on node \"crc\" DevicePath \"\""
Mar 20 09:26:04 crc kubenswrapper[4958]: I0320 09:26:04.454684 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566646-vzdc5" event={"ID":"20769995-eaa7-4d5c-9a17-9252d478f74f","Type":"ContainerDied","Data":"ca00017316ed7843d11cbafafa56b31797350edb5679ba6ccf50bee8d2ec9b13"}
Mar 20 09:26:04 crc kubenswrapper[4958]: I0320 09:26:04.454740 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca00017316ed7843d11cbafafa56b31797350edb5679ba6ccf50bee8d2ec9b13"
Mar 20 09:26:04 crc kubenswrapper[4958]: I0320 09:26:04.454757 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566646-vzdc5"
Mar 20 09:26:04 crc kubenswrapper[4958]: I0320 09:26:04.809988 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-x9skg"]
Mar 20 09:26:04 crc kubenswrapper[4958]: I0320 09:26:04.814976 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566640-x9skg"]
Mar 20 09:26:06 crc kubenswrapper[4958]: I0320 09:26:06.447684 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79" path="/var/lib/kubelet/pods/a2dfdaa2-711d-4b33-a0e3-1f3e492e5f79/volumes"
Mar 20 09:26:11 crc kubenswrapper[4958]: I0320 09:26:11.435454 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:26:11 crc kubenswrapper[4958]: E0320 09:26:11.437052 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:22 crc kubenswrapper[4958]: I0320 09:26:22.435087 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:26:22 crc kubenswrapper[4958]: E0320 09:26:22.436319 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:33 crc kubenswrapper[4958]: I0320 09:26:33.435036 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:26:33 crc kubenswrapper[4958]: E0320 09:26:33.436324 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:43 crc kubenswrapper[4958]: I0320 09:26:43.697086 4958 scope.go:117] "RemoveContainer" containerID="3d56647e17b75d6ca56242719838964b4d7336c7b3019bacf60c8a186696be0b"
Mar 20 09:26:44 crc kubenswrapper[4958]: I0320 09:26:44.435251 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:26:44 crc kubenswrapper[4958]: E0320 09:26:44.435726 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.365317 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:26:45 crc kubenswrapper[4958]: E0320 09:26:45.365816 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20769995-eaa7-4d5c-9a17-9252d478f74f" containerName="oc"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.365831 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="20769995-eaa7-4d5c-9a17-9252d478f74f" containerName="oc"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.366027 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="20769995-eaa7-4d5c-9a17-9252d478f74f" containerName="oc"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.367379 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.381677 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.480387 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.480777 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.480884 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6hn\" (UniqueName: \"kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.582819 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.582930 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6hn\" (UniqueName: \"kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.582962 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.583636 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.583667 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.608189 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6hn\" (UniqueName: \"kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn\") pod \"certified-operators-7dqv5\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") " pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:45 crc kubenswrapper[4958]: I0320 09:26:45.696059 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:46 crc kubenswrapper[4958]: I0320 09:26:46.210004 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:26:46 crc kubenswrapper[4958]: I0320 09:26:46.815345 4958 generic.go:334] "Generic (PLEG): container finished" podID="2036497e-4754-45d4-b8f3-8a5929614d58" containerID="02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537" exitCode=0
Mar 20 09:26:46 crc kubenswrapper[4958]: I0320 09:26:46.815406 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerDied","Data":"02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537"}
Mar 20 09:26:46 crc kubenswrapper[4958]: I0320 09:26:46.815440 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerStarted","Data":"277484f48cee9c2bf2f13c8a98310815b4e27062f3fa16d37146f4601276a990"}
Mar 20 09:26:47 crc kubenswrapper[4958]: I0320 09:26:47.842388 4958 generic.go:334] "Generic (PLEG): container finished" podID="2036497e-4754-45d4-b8f3-8a5929614d58" containerID="087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc" exitCode=0
Mar 20 09:26:47 crc kubenswrapper[4958]: I0320 09:26:47.842508 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerDied","Data":"087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc"}
Mar 20 09:26:48 crc kubenswrapper[4958]: I0320 09:26:48.853557 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerStarted","Data":"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"}
Mar 20 09:26:48 crc kubenswrapper[4958]: I0320 09:26:48.883453 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7dqv5" podStartSLOduration=2.456476818 podStartE2EDuration="3.883428904s" podCreationTimestamp="2026-03-20 09:26:45 +0000 UTC" firstStartedPulling="2026-03-20 09:26:46.817359227 +0000 UTC m=+1627.139375185" lastFinishedPulling="2026-03-20 09:26:48.244311313 +0000 UTC m=+1628.566327271" observedRunningTime="2026-03-20 09:26:48.882138599 +0000 UTC m=+1629.204154597" watchObservedRunningTime="2026-03-20 09:26:48.883428904 +0000 UTC m=+1629.205444862"
Mar 20 09:26:55 crc kubenswrapper[4958]: I0320 09:26:55.696681 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:55 crc kubenswrapper[4958]: I0320 09:26:55.697536 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:55 crc kubenswrapper[4958]: I0320 09:26:55.740268 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:55 crc kubenswrapper[4958]: I0320 09:26:55.947059 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:55 crc kubenswrapper[4958]: I0320 09:26:55.998664 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:26:57 crc kubenswrapper[4958]: I0320 09:26:57.923733 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7dqv5" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="registry-server" containerID="cri-o://558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d" gracePeriod=2
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.347579 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.397333 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities\") pod \"2036497e-4754-45d4-b8f3-8a5929614d58\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") "
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.397389 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz6hn\" (UniqueName: \"kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn\") pod \"2036497e-4754-45d4-b8f3-8a5929614d58\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") "
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.397456 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content\") pod \"2036497e-4754-45d4-b8f3-8a5929614d58\" (UID: \"2036497e-4754-45d4-b8f3-8a5929614d58\") "
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.398165 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities" (OuterVolumeSpecName: "utilities") pod "2036497e-4754-45d4-b8f3-8a5929614d58" (UID: "2036497e-4754-45d4-b8f3-8a5929614d58"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.405476 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn" (OuterVolumeSpecName: "kube-api-access-fz6hn") pod "2036497e-4754-45d4-b8f3-8a5929614d58" (UID: "2036497e-4754-45d4-b8f3-8a5929614d58"). InnerVolumeSpecName "kube-api-access-fz6hn". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.434812 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:26:58 crc kubenswrapper[4958]: E0320 09:26:58.435190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.499816 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.499863 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz6hn\" (UniqueName: \"kubernetes.io/projected/2036497e-4754-45d4-b8f3-8a5929614d58-kube-api-access-fz6hn\") on node \"crc\" DevicePath \"\""
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.933017 4958 generic.go:334] "Generic (PLEG): container finished" podID="2036497e-4754-45d4-b8f3-8a5929614d58" containerID="558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d" exitCode=0
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.933087 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerDied","Data":"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"}
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.933136 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7dqv5" event={"ID":"2036497e-4754-45d4-b8f3-8a5929614d58","Type":"ContainerDied","Data":"277484f48cee9c2bf2f13c8a98310815b4e27062f3fa16d37146f4601276a990"}
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.933131 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7dqv5"
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.933188 4958 scope.go:117] "RemoveContainer" containerID="558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"
Mar 20 09:26:58 crc kubenswrapper[4958]: I0320 09:26:58.966069 4958 scope.go:117] "RemoveContainer" containerID="087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.002030 4958 scope.go:117] "RemoveContainer" containerID="02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.040064 4958 scope.go:117] "RemoveContainer" containerID="558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"
Mar 20 09:26:59 crc kubenswrapper[4958]: E0320 09:26:59.040853 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d\": container with ID starting with 558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d not found: ID does not exist" containerID="558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.040894 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d"} err="failed to get container status \"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d\": rpc error: code = NotFound desc = could not find container \"558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d\": container with ID starting with 558aef32790b99cde980e86e7f9843d0b42bd2caec58c0d5988f16b057a04d4d not found: ID does not exist"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.040944 4958 scope.go:117] "RemoveContainer" containerID="087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc"
Mar 20 09:26:59 crc kubenswrapper[4958]: E0320 09:26:59.041528 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc\": container with ID starting with 087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc not found: ID does not exist" containerID="087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.041563 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc"} err="failed to get container status \"087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc\": rpc error: code = NotFound desc = could not find container \"087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc\": container with ID starting with 087edac10989bb2f9e27e76a7c894f30e42b499cb202e1737d80bfba47f340fc not found: ID does not exist"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.041579 4958 scope.go:117] "RemoveContainer" containerID="02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537"
Mar 20 09:26:59 crc kubenswrapper[4958]: E0320 09:26:59.042007 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537\": container with ID starting with 02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537 not found: ID does not exist" containerID="02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.042074 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537"} err="failed to get container status \"02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537\": rpc error: code = NotFound desc = could not find container \"02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537\": container with ID starting with 02de754843b62b8895f2db2ed3169a12e1c23b38aa059706a3b5e65319499537 not found: ID does not exist"
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.136758 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2036497e-4754-45d4-b8f3-8a5929614d58" (UID: "2036497e-4754-45d4-b8f3-8a5929614d58"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.211412 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2036497e-4754-45d4-b8f3-8a5929614d58-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.271178 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:26:59 crc kubenswrapper[4958]: I0320 09:26:59.276348 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7dqv5"]
Mar 20 09:27:00 crc kubenswrapper[4958]: I0320 09:27:00.446117 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" path="/var/lib/kubelet/pods/2036497e-4754-45d4-b8f3-8a5929614d58/volumes"
Mar 20 09:27:13 crc kubenswrapper[4958]: I0320 09:27:13.436448 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:27:13 crc kubenswrapper[4958]: E0320 09:27:13.439069 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:27:28 crc kubenswrapper[4958]: I0320 09:27:28.435223 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:27:28 crc kubenswrapper[4958]: E0320 09:27:28.436402 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:27:41 crc kubenswrapper[4958]: I0320 09:27:41.435360 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:27:41 crc kubenswrapper[4958]: E0320 09:27:41.436564 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:27:54 crc kubenswrapper[4958]: I0320 09:27:54.435250 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43"
Mar 20 09:27:54 crc kubenswrapper[4958]: E0320 09:27:54.436111 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.147883 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566648-fv945"]
Mar 20 09:28:00 crc kubenswrapper[4958]: E0320 09:28:00.148637 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="extract-utilities"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.148658 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="extract-utilities"
Mar 20 09:28:00 crc kubenswrapper[4958]: E0320 09:28:00.148676 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="extract-content"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.148684 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="extract-content"
Mar 20 09:28:00 crc kubenswrapper[4958]: E0320 09:28:00.148698 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="registry-server"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.148707 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="registry-server"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.148919 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2036497e-4754-45d4-b8f3-8a5929614d58" containerName="registry-server"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.149515 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-fv945"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.157032 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.157432 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.158272 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.162038 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-fv945"]
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.227824 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt2gm\" (UniqueName: \"kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm\") pod \"auto-csr-approver-29566648-fv945\" (UID: \"2f426c3a-7eb6-49eb-ba58-6b83b6b67167\") " pod="openshift-infra/auto-csr-approver-29566648-fv945"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.328728 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kt2gm\" (UniqueName: \"kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm\") pod \"auto-csr-approver-29566648-fv945\" (UID: \"2f426c3a-7eb6-49eb-ba58-6b83b6b67167\") " pod="openshift-infra/auto-csr-approver-29566648-fv945"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.349830 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt2gm\" (UniqueName: \"kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm\") pod \"auto-csr-approver-29566648-fv945\" (UID: \"2f426c3a-7eb6-49eb-ba58-6b83b6b67167\") " pod="openshift-infra/auto-csr-approver-29566648-fv945"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.475670 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-fv945"
Mar 20 09:28:00 crc kubenswrapper[4958]: I0320 09:28:00.903497 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-fv945"]
Mar 20 09:28:01 crc kubenswrapper[4958]: I0320 09:28:01.463669 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-fv945" event={"ID":"2f426c3a-7eb6-49eb-ba58-6b83b6b67167","Type":"ContainerStarted","Data":"be0396a2682dfae9de330636adfa7ae8aeb6ea7f36b33c3793b2834261b86992"}
Mar 20 09:28:03 crc kubenswrapper[4958]: I0320 09:28:03.479929 4958 generic.go:334] "Generic (PLEG): container finished" podID="2f426c3a-7eb6-49eb-ba58-6b83b6b67167" containerID="c3bbebb08e3de0dce8534c1086e4e53b98b155990ee5442ceb2bc2360863f0f9" exitCode=0
Mar 20 09:28:03 crc kubenswrapper[4958]: I0320 09:28:03.480210 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-fv945" event={"ID":"2f426c3a-7eb6-49eb-ba58-6b83b6b67167","Type":"ContainerDied","Data":"c3bbebb08e3de0dce8534c1086e4e53b98b155990ee5442ceb2bc2360863f0f9"}
Mar 20 09:28:04 crc kubenswrapper[4958]: I0320 09:28:04.766836 4958 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-fv945" Mar 20 09:28:04 crc kubenswrapper[4958]: I0320 09:28:04.899525 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kt2gm\" (UniqueName: \"kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm\") pod \"2f426c3a-7eb6-49eb-ba58-6b83b6b67167\" (UID: \"2f426c3a-7eb6-49eb-ba58-6b83b6b67167\") " Mar 20 09:28:04 crc kubenswrapper[4958]: I0320 09:28:04.904708 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm" (OuterVolumeSpecName: "kube-api-access-kt2gm") pod "2f426c3a-7eb6-49eb-ba58-6b83b6b67167" (UID: "2f426c3a-7eb6-49eb-ba58-6b83b6b67167"). InnerVolumeSpecName "kube-api-access-kt2gm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.001677 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kt2gm\" (UniqueName: \"kubernetes.io/projected/2f426c3a-7eb6-49eb-ba58-6b83b6b67167-kube-api-access-kt2gm\") on node \"crc\" DevicePath \"\"" Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.495993 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566648-fv945" event={"ID":"2f426c3a-7eb6-49eb-ba58-6b83b6b67167","Type":"ContainerDied","Data":"be0396a2682dfae9de330636adfa7ae8aeb6ea7f36b33c3793b2834261b86992"} Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.496040 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="be0396a2682dfae9de330636adfa7ae8aeb6ea7f36b33c3793b2834261b86992" Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.496073 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566648-fv945" Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.834204 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-cz2m9"] Mar 20 09:28:05 crc kubenswrapper[4958]: I0320 09:28:05.840702 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566642-cz2m9"] Mar 20 09:28:06 crc kubenswrapper[4958]: I0320 09:28:06.443591 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d221edd0-8dd5-41f9-b864-43f1f91b3f77" path="/var/lib/kubelet/pods/d221edd0-8dd5-41f9-b864-43f1f91b3f77/volumes" Mar 20 09:28:08 crc kubenswrapper[4958]: I0320 09:28:08.434580 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:28:08 crc kubenswrapper[4958]: E0320 09:28:08.434886 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:28:21 crc kubenswrapper[4958]: I0320 09:28:21.435010 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:28:21 crc kubenswrapper[4958]: E0320 09:28:21.436142 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:28:32 crc kubenswrapper[4958]: I0320 09:28:32.435780 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:28:32 crc kubenswrapper[4958]: E0320 09:28:32.438625 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:28:43 crc kubenswrapper[4958]: I0320 09:28:43.435797 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:28:43 crc kubenswrapper[4958]: E0320 09:28:43.436939 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:28:43 crc kubenswrapper[4958]: I0320 09:28:43.822344 4958 scope.go:117] "RemoveContainer" containerID="aa806e5f2cc2ac5ed0acb858464202e5ff0c3a417b9748361786bc84d4cf2ec5" Mar 20 
09:28:56 crc kubenswrapper[4958]: I0320 09:28:56.435666 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:28:56 crc kubenswrapper[4958]: E0320 09:28:56.436951 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:29:10 crc kubenswrapper[4958]: I0320 09:29:10.441937 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:29:10 crc kubenswrapper[4958]: E0320 09:29:10.448548 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:29:23 crc kubenswrapper[4958]: I0320 09:29:23.435167 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:29:23 crc kubenswrapper[4958]: E0320 09:29:23.435712 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:29:38 crc kubenswrapper[4958]: I0320 09:29:38.435233 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:29:38 crc kubenswrapper[4958]: E0320 09:29:38.436204 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:29:52 crc kubenswrapper[4958]: I0320 09:29:52.434611 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:29:52 crc kubenswrapper[4958]: E0320 09:29:52.435420 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.150861 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566650-9srq9"] Mar 20 09:30:00 crc 
kubenswrapper[4958]: E0320 09:30:00.154317 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2f426c3a-7eb6-49eb-ba58-6b83b6b67167" containerName="oc" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.154371 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2f426c3a-7eb6-49eb-ba58-6b83b6b67167" containerName="oc" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.154567 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2f426c3a-7eb6-49eb-ba58-6b83b6b67167" containerName="oc" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.155295 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.156204 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss"] Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.157478 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.157757 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.159573 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.162578 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.162619 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.164589 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-9srq9"] Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.164626 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.191345 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss"] Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.205073 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.205143 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.205184 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fz6wb\" (UniqueName: 
\"kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb\") pod \"auto-csr-approver-29566650-9srq9\" (UID: \"4b41a815-9af6-4747-a1b8-69b98ec2dafe\") " pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.205250 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz6rc\" (UniqueName: \"kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.306092 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.306153 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.306188 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fz6wb\" (UniqueName: \"kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb\") pod \"auto-csr-approver-29566650-9srq9\" (UID: \"4b41a815-9af6-4747-a1b8-69b98ec2dafe\") " pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.306259 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mz6rc\" (UniqueName: \"kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.307549 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.313331 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.325328 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz6rc\" (UniqueName: \"kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc\") pod \"collect-profiles-29566650-2s2ss\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.325803 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fz6wb\" (UniqueName: \"kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb\") pod \"auto-csr-approver-29566650-9srq9\" (UID: \"4b41a815-9af6-4747-a1b8-69b98ec2dafe\") " pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.483833 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.494865 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.964330 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-9srq9"] Mar 20 09:30:00 crc kubenswrapper[4958]: I0320 09:30:00.973161 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:30:01 crc kubenswrapper[4958]: I0320 09:30:01.018127 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss"] Mar 20 09:30:01 crc kubenswrapper[4958]: W0320 09:30:01.019270 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd44cc46_bec2_44d3_9571_0d3d9dcacc21.slice/crio-28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94 WatchSource:0}: Error finding container 28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94: Status 404 returned error can't find the container with id 28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94 Mar 20 09:30:01 crc kubenswrapper[4958]: I0320 09:30:01.456838 4958 generic.go:334] "Generic (PLEG): container finished" podID="dd44cc46-bec2-44d3-9571-0d3d9dcacc21" containerID="a1fe5e216feae9ab64655398de5c1818071f35951756d4dddefa58e6ecf9ebed" exitCode=0 Mar 20 09:30:01 crc kubenswrapper[4958]: I0320 09:30:01.457055 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" event={"ID":"dd44cc46-bec2-44d3-9571-0d3d9dcacc21","Type":"ContainerDied","Data":"a1fe5e216feae9ab64655398de5c1818071f35951756d4dddefa58e6ecf9ebed"} Mar 20 09:30:01 crc kubenswrapper[4958]: I0320 09:30:01.457367 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" event={"ID":"dd44cc46-bec2-44d3-9571-0d3d9dcacc21","Type":"ContainerStarted","Data":"28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94"} Mar 20 09:30:01 crc kubenswrapper[4958]: I0320 09:30:01.459126 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-9srq9" event={"ID":"4b41a815-9af6-4747-a1b8-69b98ec2dafe","Type":"ContainerStarted","Data":"302ca46b76270e5b971e001a4302b5342870ac6d1cf36c6dc65455cdae874def"} Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.747786 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.851078 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume\") pod \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.851294 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume\") pod \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.851361 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz6rc\" (UniqueName: \"kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc\") pod \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\" (UID: \"dd44cc46-bec2-44d3-9571-0d3d9dcacc21\") " Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.852669 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume" (OuterVolumeSpecName: "config-volume") pod "dd44cc46-bec2-44d3-9571-0d3d9dcacc21" (UID: "dd44cc46-bec2-44d3-9571-0d3d9dcacc21"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.858232 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc" (OuterVolumeSpecName: "kube-api-access-mz6rc") pod "dd44cc46-bec2-44d3-9571-0d3d9dcacc21" (UID: "dd44cc46-bec2-44d3-9571-0d3d9dcacc21"). InnerVolumeSpecName "kube-api-access-mz6rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.858264 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dd44cc46-bec2-44d3-9571-0d3d9dcacc21" (UID: "dd44cc46-bec2-44d3-9571-0d3d9dcacc21"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.952913 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.953460 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mz6rc\" (UniqueName: \"kubernetes.io/projected/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-kube-api-access-mz6rc\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:02 crc kubenswrapper[4958]: I0320 09:30:02.953477 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dd44cc46-bec2-44d3-9571-0d3d9dcacc21-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:03 crc kubenswrapper[4958]: I0320 09:30:03.485847 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" Mar 20 09:30:03 crc kubenswrapper[4958]: I0320 09:30:03.485858 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566650-2s2ss" event={"ID":"dd44cc46-bec2-44d3-9571-0d3d9dcacc21","Type":"ContainerDied","Data":"28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94"} Mar 20 09:30:03 crc kubenswrapper[4958]: I0320 09:30:03.485978 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28f02c36b0a9d251419ddfd190bfa0641e2da63996dbf7a669d840b993344c94" Mar 20 09:30:03 crc kubenswrapper[4958]: I0320 09:30:03.490358 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-9srq9" event={"ID":"4b41a815-9af6-4747-a1b8-69b98ec2dafe","Type":"ContainerStarted","Data":"b8bc08d93e928b428d371eebe33224513a75314307bd2853ba583531a7a95bc7"} Mar 20 09:30:04 crc kubenswrapper[4958]: I0320 09:30:04.502263 4958 generic.go:334] "Generic (PLEG): container finished" podID="4b41a815-9af6-4747-a1b8-69b98ec2dafe" containerID="b8bc08d93e928b428d371eebe33224513a75314307bd2853ba583531a7a95bc7" exitCode=0 Mar 20 09:30:04 crc kubenswrapper[4958]: I0320 09:30:04.502411 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-9srq9" event={"ID":"4b41a815-9af6-4747-a1b8-69b98ec2dafe","Type":"ContainerDied","Data":"b8bc08d93e928b428d371eebe33224513a75314307bd2853ba583531a7a95bc7"} Mar 20 09:30:04 crc kubenswrapper[4958]: I0320 09:30:04.803209 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:04 crc kubenswrapper[4958]: I0320 09:30:04.984402 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fz6wb\" (UniqueName: \"kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb\") pod \"4b41a815-9af6-4747-a1b8-69b98ec2dafe\" (UID: \"4b41a815-9af6-4747-a1b8-69b98ec2dafe\") " Mar 20 09:30:04 crc kubenswrapper[4958]: I0320 09:30:04.992566 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb" (OuterVolumeSpecName: "kube-api-access-fz6wb") pod "4b41a815-9af6-4747-a1b8-69b98ec2dafe" (UID: "4b41a815-9af6-4747-a1b8-69b98ec2dafe"). InnerVolumeSpecName "kube-api-access-fz6wb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.086286 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fz6wb\" (UniqueName: \"kubernetes.io/projected/4b41a815-9af6-4747-a1b8-69b98ec2dafe-kube-api-access-fz6wb\") on node \"crc\" DevicePath \"\"" Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.513469 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566650-9srq9" event={"ID":"4b41a815-9af6-4747-a1b8-69b98ec2dafe","Type":"ContainerDied","Data":"302ca46b76270e5b971e001a4302b5342870ac6d1cf36c6dc65455cdae874def"} Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.513992 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="302ca46b76270e5b971e001a4302b5342870ac6d1cf36c6dc65455cdae874def" Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.513526 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566650-9srq9" Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.878988 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-gqcl5"] Mar 20 09:30:05 crc kubenswrapper[4958]: I0320 09:30:05.884881 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566644-gqcl5"] Mar 20 09:30:06 crc kubenswrapper[4958]: I0320 09:30:06.435463 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:30:06 crc kubenswrapper[4958]: E0320 09:30:06.435740 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:30:06 crc kubenswrapper[4958]: I0320 09:30:06.446880 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="086880e3-27d5-49eb-ad88-5efb0da29e01" path="/var/lib/kubelet/pods/086880e3-27d5-49eb-ad88-5efb0da29e01/volumes" Mar 20 09:30:18 crc kubenswrapper[4958]: I0320 09:30:18.434978 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:30:18 crc kubenswrapper[4958]: E0320 09:30:18.435750 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:30:31 crc kubenswrapper[4958]: I0320 09:30:31.436091 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:30:31 crc kubenswrapper[4958]: E0320 09:30:31.439127 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:30:43 crc kubenswrapper[4958]: I0320 09:30:43.927397 4958 scope.go:117] "RemoveContainer" containerID="2a21c8f81a038404043c600313d630aa26acaba751a3a777a393455846d28ced" Mar 20 09:30:46 crc kubenswrapper[4958]: I0320 09:30:46.435016 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:30:46 crc kubenswrapper[4958]: E0320 09:30:46.435782 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 
09:30:58 crc kubenswrapper[4958]: I0320 09:30:58.435374 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:30:58 crc kubenswrapper[4958]: I0320 09:30:58.955391 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2"} Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.150152 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566652-jxvh6"] Mar 20 09:32:00 crc kubenswrapper[4958]: E0320 09:32:00.151808 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4b41a815-9af6-4747-a1b8-69b98ec2dafe" containerName="oc" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.151831 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b41a815-9af6-4747-a1b8-69b98ec2dafe" containerName="oc" Mar 20 09:32:00 crc kubenswrapper[4958]: E0320 09:32:00.151852 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dd44cc46-bec2-44d3-9571-0d3d9dcacc21" containerName="collect-profiles" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.151860 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd44cc46-bec2-44d3-9571-0d3d9dcacc21" containerName="collect-profiles" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.152092 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="dd44cc46-bec2-44d3-9571-0d3d9dcacc21" containerName="collect-profiles" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.152113 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="4b41a815-9af6-4747-a1b8-69b98ec2dafe" containerName="oc" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.153025 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.156737 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-jxvh6"] Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.161287 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.161500 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.161541 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.263948 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djghh\" (UniqueName: \"kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh\") pod \"auto-csr-approver-29566652-jxvh6\" (UID: \"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01\") " pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.366468 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djghh\" (UniqueName: \"kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh\") pod \"auto-csr-approver-29566652-jxvh6\" (UID: \"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01\") " pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.389343 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djghh\" (UniqueName: \"kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh\") pod \"auto-csr-approver-29566652-jxvh6\" (UID: \"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01\") " pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.472395 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:00 crc kubenswrapper[4958]: I0320 09:32:00.700412 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-jxvh6"] Mar 20 09:32:01 crc kubenswrapper[4958]: I0320 09:32:01.484115 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" event={"ID":"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01","Type":"ContainerStarted","Data":"f8fc8f9b6aa70d7458e56508ffbb3b1528a530340c9617c65d6e1ebcbd8c3de4"} Mar 20 09:32:02 crc kubenswrapper[4958]: I0320 09:32:02.496149 4958 generic.go:334] "Generic (PLEG): container finished" podID="8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" containerID="97a34dfd347343cb9968b7afe41cb78a310e74eff4f782e7211de89cb3ae31e1" exitCode=0 Mar 20 09:32:02 crc kubenswrapper[4958]: I0320 09:32:02.496233 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" event={"ID":"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01","Type":"ContainerDied","Data":"97a34dfd347343cb9968b7afe41cb78a310e74eff4f782e7211de89cb3ae31e1"} Mar 20 09:32:03 crc kubenswrapper[4958]: I0320 09:32:03.809813 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:03 crc kubenswrapper[4958]: I0320 09:32:03.936549 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-djghh\" (UniqueName: \"kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh\") pod \"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01\" (UID: \"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01\") " Mar 20 09:32:03 crc kubenswrapper[4958]: I0320 09:32:03.946540 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh" (OuterVolumeSpecName: "kube-api-access-djghh") pod "8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" (UID: "8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01"). InnerVolumeSpecName "kube-api-access-djghh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.039086 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-djghh\" (UniqueName: \"kubernetes.io/projected/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01-kube-api-access-djghh\") on node \"crc\" DevicePath \"\"" Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.516907 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" event={"ID":"8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01","Type":"ContainerDied","Data":"f8fc8f9b6aa70d7458e56508ffbb3b1528a530340c9617c65d6e1ebcbd8c3de4"} Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.517282 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f8fc8f9b6aa70d7458e56508ffbb3b1528a530340c9617c65d6e1ebcbd8c3de4" Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.516981 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566652-jxvh6" Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.892922 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-vzdc5"] Mar 20 09:32:04 crc kubenswrapper[4958]: I0320 09:32:04.898234 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566646-vzdc5"] Mar 20 09:32:06 crc kubenswrapper[4958]: I0320 09:32:06.448875 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20769995-eaa7-4d5c-9a17-9252d478f74f" path="/var/lib/kubelet/pods/20769995-eaa7-4d5c-9a17-9252d478f74f/volumes" Mar 20 09:32:44 crc kubenswrapper[4958]: I0320 09:32:44.018515 4958 scope.go:117] "RemoveContainer" containerID="9eefae57cc37e912b2679aa07a2fdd91cd61c9953cc29284029e0c687ff915f3" Mar 20 09:33:26 crc kubenswrapper[4958]: I0320 09:33:26.521688 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:33:26 crc kubenswrapper[4958]: I0320 09:33:26.522347 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:33:56 crc kubenswrapper[4958]: I0320 09:33:56.521364 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:33:56 crc kubenswrapper[4958]: I0320 09:33:56.522396 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.312632 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:33:58 crc kubenswrapper[4958]: E0320 09:33:58.313424 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" containerName="oc" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.313438 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" containerName="oc" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.313634 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" containerName="oc" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.315005 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.322558 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.322701 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.322839 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v88qk\" (UniqueName: \"kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.331244 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.424891 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v88qk\" (UniqueName: \"kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.425141 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.425198 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.426070 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.426095 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.454671 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-v88qk\" (UniqueName: \"kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk\") pod \"redhat-operators-npvsn\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:58 crc kubenswrapper[4958]: I0320 09:33:58.645948 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:33:59 crc kubenswrapper[4958]: I0320 09:33:59.097186 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:33:59 crc kubenswrapper[4958]: I0320 09:33:59.454421 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerID="41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e" exitCode=0 Mar 20 09:33:59 crc kubenswrapper[4958]: I0320 09:33:59.454545 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerDied","Data":"41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e"} Mar 20 09:33:59 crc kubenswrapper[4958]: I0320 09:33:59.454916 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerStarted","Data":"8ba9e4e0157eea9bfc61442ced57a5e3a06f5f0cc78a38f220bc411f41c67e87"} Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.150138 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566654-mk4r8"] Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.152494 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.157672 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.157764 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.157695 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.165914 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-mk4r8"] Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.255425 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlv8m\" (UniqueName: \"kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m\") pod \"auto-csr-approver-29566654-mk4r8\" (UID: \"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2\") " pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.356896 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlv8m\" (UniqueName: \"kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m\") pod \"auto-csr-approver-29566654-mk4r8\" (UID: \"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2\") " pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.380382 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlv8m\" (UniqueName: \"kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m\") pod \"auto-csr-approver-29566654-mk4r8\" (UID: \"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2\") " pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.467097 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerStarted","Data":"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247"} Mar 20 09:34:00 crc kubenswrapper[4958]: I0320 09:34:00.511895 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:01 crc kubenswrapper[4958]: I0320 09:34:01.025244 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-mk4r8"] Mar 20 09:34:01 crc kubenswrapper[4958]: I0320 09:34:01.475278 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerID="c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247" exitCode=0 Mar 20 09:34:01 crc kubenswrapper[4958]: I0320 09:34:01.475378 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerDied","Data":"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247"} Mar 20 09:34:01 crc kubenswrapper[4958]: I0320 09:34:01.476976 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" event={"ID":"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2","Type":"ContainerStarted","Data":"8a76ea62911559f61beac0e6445d8af913df97f58acff1e3270da2dfe9a101cc"} Mar 20 09:34:02 crc kubenswrapper[4958]: I0320 09:34:02.487784 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerStarted","Data":"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72"} Mar 20 09:34:02 crc kubenswrapper[4958]: I0320 09:34:02.513748 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-npvsn" podStartSLOduration=2.044075259 podStartE2EDuration="4.513720267s" podCreationTimestamp="2026-03-20 09:33:58 +0000 UTC" firstStartedPulling="2026-03-20 09:33:59.456301758 +0000 UTC m=+2059.778317716" lastFinishedPulling="2026-03-20 09:34:01.925946766 +0000 UTC m=+2062.247962724" observedRunningTime="2026-03-20 09:34:02.510853897 +0000 UTC m=+2062.832869865" watchObservedRunningTime="2026-03-20 09:34:02.513720267 +0000 UTC m=+2062.835736225" Mar 20 09:34:03 crc kubenswrapper[4958]: E0320 09:34:03.271621 4958 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4e46e4_3330_4be2_b41f_9d39ae7b85e2.slice/crio-c9c6aa7fffeae132d26504f29963273335587e6dd251a9222347992e72a7e6df.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfa4e46e4_3330_4be2_b41f_9d39ae7b85e2.slice/crio-conmon-c9c6aa7fffeae132d26504f29963273335587e6dd251a9222347992e72a7e6df.scope\": RecentStats: unable to find data in memory cache]" Mar 20 09:34:03 crc kubenswrapper[4958]: I0320 09:34:03.499840 4958 generic.go:334] "Generic (PLEG): container finished" podID="fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" containerID="c9c6aa7fffeae132d26504f29963273335587e6dd251a9222347992e72a7e6df" exitCode=0 Mar 20 09:34:03 crc kubenswrapper[4958]: I0320 09:34:03.499963 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" event={"ID":"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2","Type":"ContainerDied","Data":"c9c6aa7fffeae132d26504f29963273335587e6dd251a9222347992e72a7e6df"} Mar 20 09:34:04 crc kubenswrapper[4958]: I0320 09:34:04.793038 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:04 crc kubenswrapper[4958]: I0320 09:34:04.944034 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlv8m\" (UniqueName: \"kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m\") pod \"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2\" (UID: \"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2\") " Mar 20 09:34:04 crc kubenswrapper[4958]: I0320 09:34:04.956259 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m" (OuterVolumeSpecName: "kube-api-access-qlv8m") pod "fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" (UID: "fa4e46e4-3330-4be2-b41f-9d39ae7b85e2"). InnerVolumeSpecName "kube-api-access-qlv8m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.046044 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlv8m\" (UniqueName: \"kubernetes.io/projected/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2-kube-api-access-qlv8m\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.519538 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" event={"ID":"fa4e46e4-3330-4be2-b41f-9d39ae7b85e2","Type":"ContainerDied","Data":"8a76ea62911559f61beac0e6445d8af913df97f58acff1e3270da2dfe9a101cc"} Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.519591 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8a76ea62911559f61beac0e6445d8af913df97f58acff1e3270da2dfe9a101cc" Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.519671 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566654-mk4r8" Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.865653 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-fv945"] Mar 20 09:34:05 crc kubenswrapper[4958]: I0320 09:34:05.874684 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566648-fv945"] Mar 20 09:34:06 crc kubenswrapper[4958]: I0320 09:34:06.465195 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2f426c3a-7eb6-49eb-ba58-6b83b6b67167" path="/var/lib/kubelet/pods/2f426c3a-7eb6-49eb-ba58-6b83b6b67167/volumes" Mar 20 09:34:08 crc kubenswrapper[4958]: I0320 09:34:08.646871 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:08 crc kubenswrapper[4958]: I0320 09:34:08.647522 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:08 crc kubenswrapper[4958]: I0320 09:34:08.706086 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:09 crc kubenswrapper[4958]: I0320 09:34:09.632788 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:09 crc kubenswrapper[4958]: I0320 09:34:09.695998 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:34:11 crc kubenswrapper[4958]: I0320 09:34:11.581398 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-npvsn" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="registry-server" containerID="cri-o://3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72" gracePeriod=2 Mar 20 09:34:11 crc kubenswrapper[4958]: I0320 09:34:11.971940 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.165963 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v88qk\" (UniqueName: \"kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk\") pod \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.166058 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities\") pod \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.166212 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content\") pod \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\" (UID: \"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6\") " Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.167326 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities" (OuterVolumeSpecName: "utilities") pod "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" (UID: "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.172925 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk" (OuterVolumeSpecName: "kube-api-access-v88qk") pod "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" (UID: "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6"). InnerVolumeSpecName "kube-api-access-v88qk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.267892 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v88qk\" (UniqueName: \"kubernetes.io/projected/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-kube-api-access-v88qk\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.267930 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.299797 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" (UID: "d0dd3c50-b03a-4d63-98b2-9c875e3c62d6"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.369948 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.600687 4958 generic.go:334] "Generic (PLEG): container finished" podID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerID="3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72" exitCode=0 Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.600731 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerDied","Data":"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72"} Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.600768 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-npvsn" event={"ID":"d0dd3c50-b03a-4d63-98b2-9c875e3c62d6","Type":"ContainerDied","Data":"8ba9e4e0157eea9bfc61442ced57a5e3a06f5f0cc78a38f220bc411f41c67e87"} Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.600792 4958 scope.go:117] "RemoveContainer" containerID="3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.600882 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-npvsn" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.623783 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.627956 4958 scope.go:117] "RemoveContainer" containerID="c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.631815 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-npvsn"] Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.652158 4958 scope.go:117] "RemoveContainer" containerID="41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.676823 4958 scope.go:117] "RemoveContainer" containerID="3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72" Mar 20 09:34:12 crc kubenswrapper[4958]: E0320 09:34:12.677401 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72\": container with ID starting with 3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72 not found: ID does not exist" containerID="3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.677444 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72"} err="failed to get container status \"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72\": rpc error: code = NotFound desc = could not find container \"3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72\": container with ID starting with 3cc28146bbfd608267b113092b30356642ef16eca626b4ebd8a6dd60af132b72 not found: ID does not exist" Mar 20 09:34:12 crc 
kubenswrapper[4958]: I0320 09:34:12.677474 4958 scope.go:117] "RemoveContainer" containerID="c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247" Mar 20 09:34:12 crc kubenswrapper[4958]: E0320 09:34:12.677798 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247\": container with ID starting with c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247 not found: ID does not exist" containerID="c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.677825 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247"} err="failed to get container status \"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247\": rpc error: code = NotFound desc = could not find container \"c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247\": container with ID starting with c01d4d1aeec1e59cfddf904012d423e2bc20c8ad34de0844111e84421095a247 not found: ID does not exist" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.677840 4958 scope.go:117] "RemoveContainer" containerID="41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e" Mar 20 09:34:12 crc kubenswrapper[4958]: E0320 09:34:12.678126 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e\": container with ID starting with 41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e not found: ID does not exist" containerID="41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e" Mar 20 09:34:12 crc kubenswrapper[4958]: I0320 09:34:12.678154 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e"} err="failed to get container status \"41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e\": rpc error: code = NotFound desc = could not find container \"41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e\": container with ID starting with 41188d4754f6227870ce7ecfea526f85a5d7bbe8fe848b46aac0391eb7c06d7e not found: ID does not exist" Mar 20 09:34:14 crc kubenswrapper[4958]: I0320 09:34:14.533765 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" path="/var/lib/kubelet/pods/d0dd3c50-b03a-4d63-98b2-9c875e3c62d6/volumes" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.254737 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:21 crc kubenswrapper[4958]: E0320 09:34:21.255904 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="extract-utilities" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.255925 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="extract-utilities" Mar 20 09:34:21 crc kubenswrapper[4958]: E0320 09:34:21.255944 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" containerName="oc" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.255955 4958 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" containerName="oc" Mar 20 09:34:21 crc kubenswrapper[4958]: E0320 09:34:21.255968 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="extract-content" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.255977 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="extract-content" Mar 20 09:34:21 crc kubenswrapper[4958]: E0320 09:34:21.256004 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="registry-server" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.256013 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="registry-server" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.258406 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0dd3c50-b03a-4d63-98b2-9c875e3c62d6" containerName="registry-server" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.258432 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" containerName="oc" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.260094 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.267407 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.425864 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkc7z\" (UniqueName: \"kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.426178 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.426758 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.529481 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.530119 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities\") pod \"community-operators-gtgwn\" 
(UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.530750 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkc7z\" (UniqueName: \"kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.531011 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.531476 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.555760 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkc7z\" (UniqueName: \"kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z\") pod \"community-operators-gtgwn\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.593096 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:21 crc kubenswrapper[4958]: I0320 09:34:21.934201 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:22 crc kubenswrapper[4958]: I0320 09:34:22.686143 4958 generic.go:334] "Generic (PLEG): container finished" podID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerID="fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817" exitCode=0 Mar 20 09:34:22 crc kubenswrapper[4958]: I0320 09:34:22.686297 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerDied","Data":"fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817"} Mar 20 09:34:22 crc kubenswrapper[4958]: I0320 09:34:22.686657 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerStarted","Data":"0b48d083afd6654f9a227b1150f7773f6cbb51259d9648b1a888fc60b4282d5f"} Mar 20 09:34:23 crc kubenswrapper[4958]: I0320 09:34:23.695663 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerStarted","Data":"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446"} Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.057062 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.062329 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.066680 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.071183 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-266sk\" (UniqueName: \"kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.071266 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.071328 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.172401 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities\") pod \"redhat-marketplace-5sr57\" (UID: 
\"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.172505 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.172614 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-266sk\" (UniqueName: \"kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.172974 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.173322 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.201478 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-266sk\" (UniqueName: \"kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk\") pod \"redhat-marketplace-5sr57\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.389013 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.660566 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:24 crc kubenswrapper[4958]: W0320 09:34:24.674755 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode34bfaa6_0cd5_40f9_bb6d_8da56938cc98.slice/crio-86fa1c921fa51d944c4fb39c4f67cd7fb6f7b690b7b964a1c92e944c6475c8dc WatchSource:0}: Error finding container 86fa1c921fa51d944c4fb39c4f67cd7fb6f7b690b7b964a1c92e944c6475c8dc: Status 404 returned error can't find the container with id 86fa1c921fa51d944c4fb39c4f67cd7fb6f7b690b7b964a1c92e944c6475c8dc Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.722984 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerStarted","Data":"86fa1c921fa51d944c4fb39c4f67cd7fb6f7b690b7b964a1c92e944c6475c8dc"} Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.725896 4958 generic.go:334] "Generic (PLEG): container finished" podID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerID="8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446" exitCode=0 Mar 20 09:34:24 crc kubenswrapper[4958]: I0320 09:34:24.725967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerDied","Data":"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446"} Mar 20 09:34:25 crc kubenswrapper[4958]: I0320 09:34:25.736223 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerStarted","Data":"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70"} Mar 20 09:34:25 crc kubenswrapper[4958]: I0320 09:34:25.738263 4958 generic.go:334] "Generic (PLEG): container finished" podID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerID="b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9" exitCode=0 Mar 20 09:34:25 crc kubenswrapper[4958]: I0320 09:34:25.738306 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerDied","Data":"b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9"} Mar 20 09:34:25 crc kubenswrapper[4958]: I0320 09:34:25.760438 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gtgwn" podStartSLOduration=2.300809907 podStartE2EDuration="4.760412348s" podCreationTimestamp="2026-03-20 09:34:21 +0000 UTC" firstStartedPulling="2026-03-20 09:34:22.687554405 +0000 UTC m=+2083.009570363" lastFinishedPulling="2026-03-20 09:34:25.147156846 +0000 UTC m=+2085.469172804" observedRunningTime="2026-03-20 09:34:25.755614625 +0000 UTC m=+2086.077630583" watchObservedRunningTime="2026-03-20 09:34:25.760412348 +0000 UTC m=+2086.082428306" Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.521525 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.522129 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.522500 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.523390 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.523496 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2" gracePeriod=600 Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.747282 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerStarted","Data":"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f"} Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.752240 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2" exitCode=0 Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.752337 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2"} Mar 20 09:34:26 crc kubenswrapper[4958]: I0320 09:34:26.752479 4958 scope.go:117] "RemoveContainer" containerID="637b689e91d858dfe24c504ab190b8fef6bc1bf87f4e830c5b38057e16b0bc43" Mar 20 09:34:27 crc kubenswrapper[4958]: I0320 09:34:27.763424 4958 generic.go:334] "Generic (PLEG): container finished" podID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerID="bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f" exitCode=0 Mar 20 09:34:27 crc kubenswrapper[4958]: I0320 09:34:27.763558 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerDied","Data":"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f"} Mar 20 09:34:27 crc kubenswrapper[4958]: I0320 09:34:27.766951 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"} Mar 20 09:34:28 crc kubenswrapper[4958]: I0320 09:34:28.778181 4958 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerStarted","Data":"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c"} Mar 20 09:34:28 crc kubenswrapper[4958]: I0320 09:34:28.799854 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-5sr57" podStartSLOduration=2.385230618 podStartE2EDuration="4.799833969s" podCreationTimestamp="2026-03-20 09:34:24 +0000 UTC" firstStartedPulling="2026-03-20 09:34:25.739920423 +0000 UTC m=+2086.061936381" lastFinishedPulling="2026-03-20 09:34:28.154523774 +0000 UTC m=+2088.476539732" observedRunningTime="2026-03-20 09:34:28.799079189 +0000 UTC m=+2089.121095177" watchObservedRunningTime="2026-03-20 09:34:28.799833969 +0000 UTC m=+2089.121849927" Mar 20 09:34:31 crc kubenswrapper[4958]: I0320 09:34:31.601715 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:31 crc kubenswrapper[4958]: I0320 09:34:31.602997 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:31 crc kubenswrapper[4958]: I0320 09:34:31.680471 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:31 crc kubenswrapper[4958]: I0320 09:34:31.838561 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:33 crc kubenswrapper[4958]: I0320 09:34:33.040823 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:34 crc kubenswrapper[4958]: I0320 09:34:34.389514 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:34 crc kubenswrapper[4958]: I0320 09:34:34.390024 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:34 crc kubenswrapper[4958]: I0320 09:34:34.447375 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:34 crc kubenswrapper[4958]: I0320 09:34:34.826102 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gtgwn" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="registry-server" containerID="cri-o://142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70" gracePeriod=2 Mar 20 09:34:34 crc kubenswrapper[4958]: I0320 09:34:34.878129 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.262172 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.349469 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities\") pod \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.349929 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkc7z\" (UniqueName: \"kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z\") pod \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.350271 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content\") pod \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\" (UID: \"930c07ab-30e6-45ff-9ac3-08dcf0785b0e\") " Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.350752 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities" (OuterVolumeSpecName: "utilities") pod "930c07ab-30e6-45ff-9ac3-08dcf0785b0e" (UID: "930c07ab-30e6-45ff-9ac3-08dcf0785b0e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.356965 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z" (OuterVolumeSpecName: "kube-api-access-dkc7z") pod "930c07ab-30e6-45ff-9ac3-08dcf0785b0e" (UID: "930c07ab-30e6-45ff-9ac3-08dcf0785b0e"). InnerVolumeSpecName "kube-api-access-dkc7z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.408698 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "930c07ab-30e6-45ff-9ac3-08dcf0785b0e" (UID: "930c07ab-30e6-45ff-9ac3-08dcf0785b0e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.442498 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.458396 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.458440 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.458453 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkc7z\" (UniqueName: \"kubernetes.io/projected/930c07ab-30e6-45ff-9ac3-08dcf0785b0e-kube-api-access-dkc7z\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.841034 4958 generic.go:334] "Generic (PLEG): container finished" podID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerID="142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70" exitCode=0 Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.841093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerDied","Data":"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70"} Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.841564 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gtgwn" event={"ID":"930c07ab-30e6-45ff-9ac3-08dcf0785b0e","Type":"ContainerDied","Data":"0b48d083afd6654f9a227b1150f7773f6cbb51259d9648b1a888fc60b4282d5f"} Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.841130 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gtgwn" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.841593 4958 scope.go:117] "RemoveContainer" containerID="142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.862741 4958 scope.go:117] "RemoveContainer" containerID="8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.878619 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.885322 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gtgwn"] Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.900588 4958 scope.go:117] "RemoveContainer" containerID="fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.924222 4958 scope.go:117] "RemoveContainer" containerID="142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70" Mar 20 09:34:35 crc kubenswrapper[4958]: E0320 09:34:35.925055 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70\": container with ID starting with 142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70 not found: ID does not exist" containerID="142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.925130 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70"} err="failed to get container status \"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70\": rpc error: code = NotFound desc = could not find container \"142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70\": container with ID starting with 142da8547c491608920e90bdbf9ebab339bec04dc47b6807f0cebab2ebacda70 not found: ID does not exist" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.925167 4958 scope.go:117] "RemoveContainer" containerID="8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446" Mar 20 09:34:35 crc kubenswrapper[4958]: E0320 09:34:35.925809 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446\": container with ID starting with 8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446 not found: ID does not exist" containerID="8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.925867 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446"} err="failed to get container status \"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446\": rpc error: code = NotFound desc = could not find container \"8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446\": container with ID starting with 8dd561ea55bf10d1c2994a2509265fc45cf99d5403d915fef55ff9928c0a2446 not found: ID does not exist" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.925906 4958 scope.go:117] "RemoveContainer" 
containerID="fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817" Mar 20 09:34:35 crc kubenswrapper[4958]: E0320 09:34:35.926206 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817\": container with ID starting with fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817 not found: ID does not exist" containerID="fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817" Mar 20 09:34:35 crc kubenswrapper[4958]: I0320 09:34:35.926237 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817"} err="failed to get container status \"fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817\": rpc error: code = NotFound desc = could not find container \"fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817\": container with ID starting with fe8054019dd68585f418828da8e1a819d3a6a039431bd0d67108a8c6ea2eb817 not found: ID does not exist" Mar 20 09:34:36 crc kubenswrapper[4958]: I0320 09:34:36.444083 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" path="/var/lib/kubelet/pods/930c07ab-30e6-45ff-9ac3-08dcf0785b0e/volumes" Mar 20 09:34:36 crc kubenswrapper[4958]: I0320 09:34:36.850293 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-5sr57" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="registry-server" containerID="cri-o://86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c" gracePeriod=2 Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.251759 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.390868 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content\") pod \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.390975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-266sk\" (UniqueName: \"kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk\") pod \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.391012 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities\") pod \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\" (UID: \"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98\") " Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.392193 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities" (OuterVolumeSpecName: "utilities") pod "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" (UID: "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.398134 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk" (OuterVolumeSpecName: "kube-api-access-266sk") pod "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" (UID: "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98"). InnerVolumeSpecName "kube-api-access-266sk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.418579 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" (UID: "e34bfaa6-0cd5-40f9-bb6d-8da56938cc98"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.492539 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.493002 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-266sk\" (UniqueName: \"kubernetes.io/projected/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-kube-api-access-266sk\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.493019 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.861531 4958 generic.go:334] "Generic (PLEG): container finished" podID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerID="86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c" exitCode=0 Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.861756 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerDied","Data":"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c"} Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.861807 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-5sr57" event={"ID":"e34bfaa6-0cd5-40f9-bb6d-8da56938cc98","Type":"ContainerDied","Data":"86fa1c921fa51d944c4fb39c4f67cd7fb6f7b690b7b964a1c92e944c6475c8dc"} Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.861837 4958 scope.go:117] "RemoveContainer" containerID="86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.862019 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-5sr57" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.881647 4958 scope.go:117] "RemoveContainer" containerID="bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.902787 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.905241 4958 scope.go:117] "RemoveContainer" containerID="b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.907772 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-5sr57"] Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.942009 4958 scope.go:117] "RemoveContainer" containerID="86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c" Mar 20 09:34:37 crc kubenswrapper[4958]: E0320 09:34:37.942774 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c\": container with ID starting with 86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c not found: ID does not exist" containerID="86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.942807 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c"} err="failed to get container status \"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c\": rpc error: code = NotFound desc = could not find container \"86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c\": container with ID starting with 86f1b7acec803789d58f41c294b3ce6ddbb300f8694dc1edc10f6b928654ed2c not found: ID does not exist" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.942831 4958 scope.go:117] "RemoveContainer" containerID="bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f" Mar 20 09:34:37 crc kubenswrapper[4958]: E0320 09:34:37.943450 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f\": container with ID starting with bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f not found: ID does not exist" containerID="bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.943478 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f"} err="failed to get container status \"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f\": rpc error: code = NotFound desc = could not find container \"bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f\": container with ID starting with bd8ce6b23d05decaadcc4361f50d401e3ade075f956c875a82737057858b327f not found: ID does not exist" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.943492 4958 scope.go:117] "RemoveContainer" containerID="b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9" Mar 20 09:34:37 crc kubenswrapper[4958]: E0320 09:34:37.943882 4958 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9\": container with ID starting with b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9 not found: ID does not exist" containerID="b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9" Mar 20 09:34:37 crc kubenswrapper[4958]: I0320 09:34:37.944007 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9"} err="failed to get container status \"b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9\": rpc error: code = NotFound desc = could not find container \"b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9\": container with ID starting with b5c742602fee010202928abebbdd552eb6d52c44752a93bd72fd7aebfff86ad9 not found: ID does not exist" Mar 20 09:34:38 crc kubenswrapper[4958]: I0320 09:34:38.446220 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" path="/var/lib/kubelet/pods/e34bfaa6-0cd5-40f9-bb6d-8da56938cc98/volumes" Mar 20 09:34:44 crc kubenswrapper[4958]: I0320 09:34:44.120439 4958 scope.go:117] "RemoveContainer" containerID="c3bbebb08e3de0dce8534c1086e4e53b98b155990ee5442ceb2bc2360863f0f9" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.158560 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566656-8jtqk"] Mar 20 09:36:00 crc kubenswrapper[4958]: E0320 09:36:00.159892 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.159909 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: E0320 09:36:00.159931 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="extract-utilities" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.159938 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="extract-utilities" Mar 20 09:36:00 crc kubenswrapper[4958]: E0320 09:36:00.159951 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="extract-content" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.159957 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="extract-content" Mar 20 09:36:00 crc kubenswrapper[4958]: E0320 09:36:00.159971 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.159977 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: E0320 09:36:00.159987 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="extract-content" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.159993 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="extract-content" Mar 20 09:36:00 crc 
kubenswrapper[4958]: E0320 09:36:00.160004 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="extract-utilities" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.160010 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="extract-utilities" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.160187 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e34bfaa6-0cd5-40f9-bb6d-8da56938cc98" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.160200 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="930c07ab-30e6-45ff-9ac3-08dcf0785b0e" containerName="registry-server" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.160835 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.163883 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.164250 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.171636 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.173678 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-8jtqk"] Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.316088 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6lxcm\" (UniqueName: \"kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm\") pod \"auto-csr-approver-29566656-8jtqk\" (UID: \"c085a2ef-2dbe-4f49-b250-e39107f4ed13\") " pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.418121 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6lxcm\" (UniqueName: \"kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm\") pod \"auto-csr-approver-29566656-8jtqk\" (UID: \"c085a2ef-2dbe-4f49-b250-e39107f4ed13\") " pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.438839 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6lxcm\" (UniqueName: \"kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm\") pod \"auto-csr-approver-29566656-8jtqk\" (UID: \"c085a2ef-2dbe-4f49-b250-e39107f4ed13\") " pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.518693 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.974753 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-8jtqk"] Mar 20 09:36:00 crc kubenswrapper[4958]: I0320 09:36:00.985432 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:36:01 crc kubenswrapper[4958]: I0320 09:36:01.522068 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" event={"ID":"c085a2ef-2dbe-4f49-b250-e39107f4ed13","Type":"ContainerStarted","Data":"5db640f776113ee66a81c1100f16a237fd7d74e60c0f2b853bc6b4c2f54b0c12"} Mar 20 09:36:02 crc kubenswrapper[4958]: I0320 09:36:02.530945 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" event={"ID":"c085a2ef-2dbe-4f49-b250-e39107f4ed13","Type":"ContainerStarted","Data":"5869a3070d2a9937efcb0e005cf6bb00c0c3d2a0f58e9ae28268c49d427ef42a"} Mar 20 09:36:02 crc kubenswrapper[4958]: I0320 09:36:02.553862 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" podStartSLOduration=1.4604609800000001 podStartE2EDuration="2.553834807s" podCreationTimestamp="2026-03-20 09:36:00 +0000 UTC" firstStartedPulling="2026-03-20 09:36:00.984891299 +0000 UTC m=+2181.306907297" lastFinishedPulling="2026-03-20 09:36:02.078265166 +0000 UTC m=+2182.400281124" observedRunningTime="2026-03-20 09:36:02.547841141 +0000 UTC m=+2182.869857089" watchObservedRunningTime="2026-03-20 09:36:02.553834807 +0000 UTC m=+2182.875850785" Mar 20 09:36:03 crc kubenswrapper[4958]: I0320 09:36:03.540530 4958 generic.go:334] "Generic (PLEG): container finished" podID="c085a2ef-2dbe-4f49-b250-e39107f4ed13" containerID="5869a3070d2a9937efcb0e005cf6bb00c0c3d2a0f58e9ae28268c49d427ef42a" exitCode=0 Mar 20 09:36:03 crc kubenswrapper[4958]: I0320 09:36:03.540653 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" event={"ID":"c085a2ef-2dbe-4f49-b250-e39107f4ed13","Type":"ContainerDied","Data":"5869a3070d2a9937efcb0e005cf6bb00c0c3d2a0f58e9ae28268c49d427ef42a"} Mar 20 09:36:04 crc kubenswrapper[4958]: I0320 09:36:04.816356 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:04 crc kubenswrapper[4958]: I0320 09:36:04.900486 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6lxcm\" (UniqueName: \"kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm\") pod \"c085a2ef-2dbe-4f49-b250-e39107f4ed13\" (UID: \"c085a2ef-2dbe-4f49-b250-e39107f4ed13\") " Mar 20 09:36:04 crc kubenswrapper[4958]: I0320 09:36:04.908817 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm" (OuterVolumeSpecName: "kube-api-access-6lxcm") pod "c085a2ef-2dbe-4f49-b250-e39107f4ed13" (UID: "c085a2ef-2dbe-4f49-b250-e39107f4ed13"). InnerVolumeSpecName "kube-api-access-6lxcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.003490 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6lxcm\" (UniqueName: \"kubernetes.io/projected/c085a2ef-2dbe-4f49-b250-e39107f4ed13-kube-api-access-6lxcm\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.559504 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" event={"ID":"c085a2ef-2dbe-4f49-b250-e39107f4ed13","Type":"ContainerDied","Data":"5db640f776113ee66a81c1100f16a237fd7d74e60c0f2b853bc6b4c2f54b0c12"} Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.559853 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5db640f776113ee66a81c1100f16a237fd7d74e60c0f2b853bc6b4c2f54b0c12" Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.559614 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566656-8jtqk" Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.632979 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-9srq9"] Mar 20 09:36:05 crc kubenswrapper[4958]: I0320 09:36:05.638530 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566650-9srq9"] Mar 20 09:36:06 crc kubenswrapper[4958]: I0320 09:36:06.449650 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b41a815-9af6-4747-a1b8-69b98ec2dafe" path="/var/lib/kubelet/pods/4b41a815-9af6-4747-a1b8-69b98ec2dafe/volumes" Mar 20 09:36:26 crc kubenswrapper[4958]: I0320 09:36:26.520815 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:36:26 crc kubenswrapper[4958]: I0320 09:36:26.521751 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:36:44 crc kubenswrapper[4958]: I0320 09:36:44.279530 4958 scope.go:117] "RemoveContainer" containerID="b8bc08d93e928b428d371eebe33224513a75314307bd2853ba583531a7a95bc7" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.563840 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:45 crc kubenswrapper[4958]: E0320 09:36:45.564836 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c085a2ef-2dbe-4f49-b250-e39107f4ed13" containerName="oc" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.564857 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c085a2ef-2dbe-4f49-b250-e39107f4ed13" containerName="oc" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.565085 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c085a2ef-2dbe-4f49-b250-e39107f4ed13" containerName="oc" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.566500 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.580438 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.672765 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.672818 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9nbf\" (UniqueName: \"kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.673387 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.775420 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.775931 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9nbf\" (UniqueName: \"kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.776054 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.776113 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.776528 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.803236 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-d9nbf\" (UniqueName: \"kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf\") pod \"certified-operators-dxs85\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:45 crc kubenswrapper[4958]: I0320 09:36:45.889207 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:46 crc kubenswrapper[4958]: I0320 09:36:46.445977 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:46 crc kubenswrapper[4958]: I0320 09:36:46.909168 4958 generic.go:334] "Generic (PLEG): container finished" podID="11986b63-a61e-4984-b36f-bb2da7159166" containerID="83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f" exitCode=0 Mar 20 09:36:46 crc kubenswrapper[4958]: I0320 09:36:46.909280 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerDied","Data":"83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f"} Mar 20 09:36:46 crc kubenswrapper[4958]: I0320 09:36:46.909579 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerStarted","Data":"992acef36b04097993e2220582a1e88e88c7ce5e96199a7f0fff720e10f3738f"} Mar 20 09:36:48 crc kubenswrapper[4958]: I0320 09:36:48.931550 4958 generic.go:334] "Generic (PLEG): container finished" podID="11986b63-a61e-4984-b36f-bb2da7159166" containerID="f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd" exitCode=0 Mar 20 09:36:48 crc kubenswrapper[4958]: I0320 09:36:48.931652 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerDied","Data":"f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd"} Mar 20 09:36:51 crc kubenswrapper[4958]: I0320 09:36:51.964892 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerStarted","Data":"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c"} Mar 20 09:36:51 crc kubenswrapper[4958]: I0320 09:36:51.990952 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dxs85" podStartSLOduration=2.9345321 podStartE2EDuration="6.990922675s" podCreationTimestamp="2026-03-20 09:36:45 +0000 UTC" firstStartedPulling="2026-03-20 09:36:46.911757507 +0000 UTC m=+2227.233773465" lastFinishedPulling="2026-03-20 09:36:50.968148072 +0000 UTC m=+2231.290164040" observedRunningTime="2026-03-20 09:36:51.988566119 +0000 UTC m=+2232.310582107" watchObservedRunningTime="2026-03-20 09:36:51.990922675 +0000 UTC m=+2232.312938673" Mar 20 09:36:55 crc kubenswrapper[4958]: I0320 09:36:55.890074 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:55 crc kubenswrapper[4958]: I0320 09:36:55.890525 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:55 crc kubenswrapper[4958]: I0320 09:36:55.936963 4958 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:56 crc kubenswrapper[4958]: I0320 09:36:56.072533 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:56 crc kubenswrapper[4958]: I0320 09:36:56.181550 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:56 crc kubenswrapper[4958]: I0320 09:36:56.521930 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:36:56 crc kubenswrapper[4958]: I0320 09:36:56.522040 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.029791 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dxs85" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="registry-server" containerID="cri-o://5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c" gracePeriod=2 Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.433309 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.593003 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities\") pod \"11986b63-a61e-4984-b36f-bb2da7159166\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.593074 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d9nbf\" (UniqueName: \"kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf\") pod \"11986b63-a61e-4984-b36f-bb2da7159166\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.593106 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content\") pod \"11986b63-a61e-4984-b36f-bb2da7159166\" (UID: \"11986b63-a61e-4984-b36f-bb2da7159166\") " Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.594665 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities" (OuterVolumeSpecName: "utilities") pod "11986b63-a61e-4984-b36f-bb2da7159166" (UID: "11986b63-a61e-4984-b36f-bb2da7159166"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.607163 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf" (OuterVolumeSpecName: "kube-api-access-d9nbf") pod "11986b63-a61e-4984-b36f-bb2da7159166" (UID: "11986b63-a61e-4984-b36f-bb2da7159166"). InnerVolumeSpecName "kube-api-access-d9nbf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.652836 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "11986b63-a61e-4984-b36f-bb2da7159166" (UID: "11986b63-a61e-4984-b36f-bb2da7159166"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.695268 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.695333 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d9nbf\" (UniqueName: \"kubernetes.io/projected/11986b63-a61e-4984-b36f-bb2da7159166-kube-api-access-d9nbf\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:58 crc kubenswrapper[4958]: I0320 09:36:58.695362 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/11986b63-a61e-4984-b36f-bb2da7159166-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.047869 4958 generic.go:334] "Generic (PLEG): container finished" podID="11986b63-a61e-4984-b36f-bb2da7159166" containerID="5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c" exitCode=0 Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.047925 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerDied","Data":"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c"} Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.047967 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dxs85" event={"ID":"11986b63-a61e-4984-b36f-bb2da7159166","Type":"ContainerDied","Data":"992acef36b04097993e2220582a1e88e88c7ce5e96199a7f0fff720e10f3738f"} Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.047962 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dxs85" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.047991 4958 scope.go:117] "RemoveContainer" containerID="5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.073098 4958 scope.go:117] "RemoveContainer" containerID="f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.090721 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.100673 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dxs85"] Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.106080 4958 scope.go:117] "RemoveContainer" containerID="83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.128699 4958 scope.go:117] "RemoveContainer" containerID="5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c" Mar 20 09:36:59 crc kubenswrapper[4958]: E0320 09:36:59.129242 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c\": container with ID starting with 5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c not found: ID does not exist" containerID="5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.129280 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c"} err="failed to get container status \"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c\": rpc error: code = NotFound desc = could not find container \"5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c\": container with ID starting with 5df8a2e1d9a6a70e39a324fcd46c4f0fa9ba73e36d734e15b3583d76335ee16c not found: ID does not exist" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.129308 4958 scope.go:117] "RemoveContainer" containerID="f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd" Mar 20 09:36:59 crc kubenswrapper[4958]: E0320 09:36:59.129823 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd\": container with ID starting with f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd not found: ID does not exist" containerID="f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.129847 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd"} err="failed to get container status \"f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd\": rpc error: code = NotFound desc = could not find container \"f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd\": container with ID starting with f7ac7daa8889eec19e9b78805c1bcd02570c59aebeb017579cd15d65f993bcbd not found: ID does not exist" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.129862 4958 scope.go:117] "RemoveContainer" 
containerID="83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f" Mar 20 09:36:59 crc kubenswrapper[4958]: E0320 09:36:59.130242 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f\": container with ID starting with 83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f not found: ID does not exist" containerID="83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f" Mar 20 09:36:59 crc kubenswrapper[4958]: I0320 09:36:59.130335 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f"} err="failed to get container status \"83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f\": rpc error: code = NotFound desc = could not find container \"83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f\": container with ID starting with 83ea31a0de0deb2012bb9e18807aaf6cb6fa015e937c126029b89e3d3149eb9f not found: ID does not exist" Mar 20 09:37:00 crc kubenswrapper[4958]: I0320 09:37:00.447710 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11986b63-a61e-4984-b36f-bb2da7159166" path="/var/lib/kubelet/pods/11986b63-a61e-4984-b36f-bb2da7159166/volumes" Mar 20 09:37:26 crc kubenswrapper[4958]: I0320 09:37:26.520956 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:37:26 crc kubenswrapper[4958]: I0320 09:37:26.521906 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:37:26 crc kubenswrapper[4958]: I0320 09:37:26.521972 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:37:26 crc kubenswrapper[4958]: I0320 09:37:26.522666 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:37:26 crc kubenswrapper[4958]: I0320 09:37:26.522740 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" gracePeriod=600 Mar 20 09:37:26 crc kubenswrapper[4958]: E0320 09:37:26.645381 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:37:27 crc kubenswrapper[4958]: I0320 09:37:27.307041 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" exitCode=0 Mar 20 09:37:27 crc kubenswrapper[4958]: I0320 09:37:27.307093 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"} Mar 20 09:37:27 crc kubenswrapper[4958]: I0320 09:37:27.307163 4958 scope.go:117] "RemoveContainer" containerID="ee5c926d8da62a10bc26c348340bfe357e037f11fb90ffa62417af57f07e12c2" Mar 20 09:37:27 crc kubenswrapper[4958]: I0320 09:37:27.308952 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:37:27 crc kubenswrapper[4958]: E0320 09:37:27.309235 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:37:42 crc kubenswrapper[4958]: I0320 09:37:42.436121 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:37:42 crc kubenswrapper[4958]: E0320 09:37:42.437164 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:37:56 crc kubenswrapper[4958]: I0320 09:37:56.435438 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:37:56 crc kubenswrapper[4958]: E0320 09:37:56.436521 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.148997 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566658-qmz92"] Mar 20 09:38:00 crc kubenswrapper[4958]: E0320 09:38:00.149939 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="extract-utilities" Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.149957 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="extract-utilities" Mar 20 09:38:00 crc kubenswrapper[4958]: E0320 09:38:00.149982 4958 
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.149992 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="registry-server"
Mar 20 09:38:00 crc kubenswrapper[4958]: E0320 09:38:00.150012 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="extract-content"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.150021 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="extract-content"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.150213 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="11986b63-a61e-4984-b36f-bb2da7159166" containerName="registry-server"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.151813 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-qmz92"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.155173 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.155200 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.155647 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.160530 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-qmz92"]
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.270212 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mv64\" (UniqueName: \"kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64\") pod \"auto-csr-approver-29566658-qmz92\" (UID: \"82c53935-88fc-4ab6-8fc4-e31647d93b52\") " pod="openshift-infra/auto-csr-approver-29566658-qmz92"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.372161 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mv64\" (UniqueName: \"kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64\") pod \"auto-csr-approver-29566658-qmz92\" (UID: \"82c53935-88fc-4ab6-8fc4-e31647d93b52\") " pod="openshift-infra/auto-csr-approver-29566658-qmz92"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.398027 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mv64\" (UniqueName: \"kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64\") pod \"auto-csr-approver-29566658-qmz92\" (UID: \"82c53935-88fc-4ab6-8fc4-e31647d93b52\") " pod="openshift-infra/auto-csr-approver-29566658-qmz92"
Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.481730 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-qmz92"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-qmz92" Mar 20 09:38:00 crc kubenswrapper[4958]: I0320 09:38:00.971652 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-qmz92"] Mar 20 09:38:01 crc kubenswrapper[4958]: I0320 09:38:01.601516 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-qmz92" event={"ID":"82c53935-88fc-4ab6-8fc4-e31647d93b52","Type":"ContainerStarted","Data":"a034839dcd071e8db588e26a01c4aab01736bb6f6115bb7d7732bbe5345bf08a"} Mar 20 09:38:02 crc kubenswrapper[4958]: I0320 09:38:02.611066 4958 generic.go:334] "Generic (PLEG): container finished" podID="82c53935-88fc-4ab6-8fc4-e31647d93b52" containerID="7631be9c8ac8067cfb37530d8bc4f10141107bc46b6b9bbc833499b2c57b466d" exitCode=0 Mar 20 09:38:02 crc kubenswrapper[4958]: I0320 09:38:02.611196 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-qmz92" event={"ID":"82c53935-88fc-4ab6-8fc4-e31647d93b52","Type":"ContainerDied","Data":"7631be9c8ac8067cfb37530d8bc4f10141107bc46b6b9bbc833499b2c57b466d"} Mar 20 09:38:03 crc kubenswrapper[4958]: I0320 09:38:03.913944 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-qmz92" Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.045191 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7mv64\" (UniqueName: \"kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64\") pod \"82c53935-88fc-4ab6-8fc4-e31647d93b52\" (UID: \"82c53935-88fc-4ab6-8fc4-e31647d93b52\") " Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.051573 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64" (OuterVolumeSpecName: "kube-api-access-7mv64") pod "82c53935-88fc-4ab6-8fc4-e31647d93b52" (UID: "82c53935-88fc-4ab6-8fc4-e31647d93b52"). InnerVolumeSpecName "kube-api-access-7mv64". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.147777 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7mv64\" (UniqueName: \"kubernetes.io/projected/82c53935-88fc-4ab6-8fc4-e31647d93b52-kube-api-access-7mv64\") on node \"crc\" DevicePath \"\"" Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.635263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566658-qmz92" event={"ID":"82c53935-88fc-4ab6-8fc4-e31647d93b52","Type":"ContainerDied","Data":"a034839dcd071e8db588e26a01c4aab01736bb6f6115bb7d7732bbe5345bf08a"} Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.635318 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a034839dcd071e8db588e26a01c4aab01736bb6f6115bb7d7732bbe5345bf08a" Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.635363 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566658-qmz92" Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.989443 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-jxvh6"] Mar 20 09:38:04 crc kubenswrapper[4958]: I0320 09:38:04.994416 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566652-jxvh6"] Mar 20 09:38:06 crc kubenswrapper[4958]: I0320 09:38:06.444417 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01" path="/var/lib/kubelet/pods/8b5611b4-8eb9-47a7-9b05-d5ee33b3ea01/volumes" Mar 20 09:38:10 crc kubenswrapper[4958]: I0320 09:38:10.438847 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:38:10 crc kubenswrapper[4958]: E0320 09:38:10.439373 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:38:23 crc kubenswrapper[4958]: I0320 09:38:23.434992 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:38:23 crc kubenswrapper[4958]: E0320 09:38:23.436106 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:38:36 crc kubenswrapper[4958]: I0320 09:38:36.435977 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:38:36 crc kubenswrapper[4958]: E0320 09:38:36.436823 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:38:44 crc kubenswrapper[4958]: I0320 09:38:44.398286 4958 scope.go:117] "RemoveContainer" containerID="97a34dfd347343cb9968b7afe41cb78a310e74eff4f782e7211de89cb3ae31e1" Mar 20 09:38:51 crc kubenswrapper[4958]: I0320 09:38:51.435688 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:38:51 crc kubenswrapper[4958]: E0320 09:38:51.436460 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 
Mar 20 09:39:03 crc kubenswrapper[4958]: I0320 09:39:03.435166 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:39:03 crc kubenswrapper[4958]: E0320 09:39:03.439020 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:39:18 crc kubenswrapper[4958]: I0320 09:39:18.435457 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:39:18 crc kubenswrapper[4958]: E0320 09:39:18.436531 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:39:32 crc kubenswrapper[4958]: I0320 09:39:32.435474 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:39:32 crc kubenswrapper[4958]: E0320 09:39:32.436393 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:39:45 crc kubenswrapper[4958]: I0320 09:39:45.435276 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:39:45 crc kubenswrapper[4958]: E0320 09:39:45.437537 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:39:57 crc kubenswrapper[4958]: I0320 09:39:57.435405 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:39:57 crc kubenswrapper[4958]: E0320 09:39:57.436634 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.159566 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566660-m5ll8"]
Mar 20 09:40:00 crc kubenswrapper[4958]: E0320 09:40:00.160041 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c53935-88fc-4ab6-8fc4-e31647d93b52" containerName="oc"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.160056 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c53935-88fc-4ab6-8fc4-e31647d93b52" containerName="oc"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.160293 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c53935-88fc-4ab6-8fc4-e31647d93b52" containerName="oc"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.161049 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-m5ll8"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.163708 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.164065 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.163918 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.177741 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-m5ll8"]
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.234696 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2s7r\" (UniqueName: \"kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r\") pod \"auto-csr-approver-29566660-m5ll8\" (UID: \"701c1cd8-a07f-4d9d-ae29-77db3778220c\") " pod="openshift-infra/auto-csr-approver-29566660-m5ll8"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.335719 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2s7r\" (UniqueName: \"kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r\") pod \"auto-csr-approver-29566660-m5ll8\" (UID: \"701c1cd8-a07f-4d9d-ae29-77db3778220c\") " pod="openshift-infra/auto-csr-approver-29566660-m5ll8"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.359868 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2s7r\" (UniqueName: \"kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r\") pod \"auto-csr-approver-29566660-m5ll8\" (UID: \"701c1cd8-a07f-4d9d-ae29-77db3778220c\") " pod="openshift-infra/auto-csr-approver-29566660-m5ll8"
Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.487338 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-m5ll8"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" Mar 20 09:40:00 crc kubenswrapper[4958]: I0320 09:40:00.733849 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-m5ll8"] Mar 20 09:40:01 crc kubenswrapper[4958]: I0320 09:40:01.181263 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" event={"ID":"701c1cd8-a07f-4d9d-ae29-77db3778220c","Type":"ContainerStarted","Data":"395e385883ed5e780d444c9a8a88cd0c9ea61fd6783e4c617999a720bdb7679c"} Mar 20 09:40:02 crc kubenswrapper[4958]: I0320 09:40:02.206110 4958 generic.go:334] "Generic (PLEG): container finished" podID="701c1cd8-a07f-4d9d-ae29-77db3778220c" containerID="33b38968d888af72ba7caea84c357f8e0aee6aedef6b974d448ac96c7eaa2815" exitCode=0 Mar 20 09:40:02 crc kubenswrapper[4958]: I0320 09:40:02.206390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" event={"ID":"701c1cd8-a07f-4d9d-ae29-77db3778220c","Type":"ContainerDied","Data":"33b38968d888af72ba7caea84c357f8e0aee6aedef6b974d448ac96c7eaa2815"} Mar 20 09:40:03 crc kubenswrapper[4958]: I0320 09:40:03.593040 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" Mar 20 09:40:03 crc kubenswrapper[4958]: I0320 09:40:03.691266 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2s7r\" (UniqueName: \"kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r\") pod \"701c1cd8-a07f-4d9d-ae29-77db3778220c\" (UID: \"701c1cd8-a07f-4d9d-ae29-77db3778220c\") " Mar 20 09:40:03 crc kubenswrapper[4958]: I0320 09:40:03.698821 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r" (OuterVolumeSpecName: "kube-api-access-s2s7r") pod "701c1cd8-a07f-4d9d-ae29-77db3778220c" (UID: "701c1cd8-a07f-4d9d-ae29-77db3778220c"). InnerVolumeSpecName "kube-api-access-s2s7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:40:03 crc kubenswrapper[4958]: I0320 09:40:03.792946 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2s7r\" (UniqueName: \"kubernetes.io/projected/701c1cd8-a07f-4d9d-ae29-77db3778220c-kube-api-access-s2s7r\") on node \"crc\" DevicePath \"\"" Mar 20 09:40:04 crc kubenswrapper[4958]: I0320 09:40:04.222071 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" event={"ID":"701c1cd8-a07f-4d9d-ae29-77db3778220c","Type":"ContainerDied","Data":"395e385883ed5e780d444c9a8a88cd0c9ea61fd6783e4c617999a720bdb7679c"} Mar 20 09:40:04 crc kubenswrapper[4958]: I0320 09:40:04.222117 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="395e385883ed5e780d444c9a8a88cd0c9ea61fd6783e4c617999a720bdb7679c" Mar 20 09:40:04 crc kubenswrapper[4958]: I0320 09:40:04.222150 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566660-m5ll8" Mar 20 09:40:04 crc kubenswrapper[4958]: I0320 09:40:04.683455 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-mk4r8"] Mar 20 09:40:04 crc kubenswrapper[4958]: I0320 09:40:04.691518 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566654-mk4r8"] Mar 20 09:40:06 crc kubenswrapper[4958]: I0320 09:40:06.443546 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fa4e46e4-3330-4be2-b41f-9d39ae7b85e2" path="/var/lib/kubelet/pods/fa4e46e4-3330-4be2-b41f-9d39ae7b85e2/volumes" Mar 20 09:40:10 crc kubenswrapper[4958]: I0320 09:40:10.443824 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:40:10 crc kubenswrapper[4958]: E0320 09:40:10.444359 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:40:23 crc kubenswrapper[4958]: I0320 09:40:23.435054 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:40:23 crc kubenswrapper[4958]: E0320 09:40:23.435973 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:40:37 crc kubenswrapper[4958]: I0320 09:40:37.435124 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:40:37 crc kubenswrapper[4958]: E0320 09:40:37.435979 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:40:44 crc kubenswrapper[4958]: I0320 09:40:44.494518 4958 scope.go:117] "RemoveContainer" containerID="c9c6aa7fffeae132d26504f29963273335587e6dd251a9222347992e72a7e6df" Mar 20 09:40:48 crc kubenswrapper[4958]: I0320 09:40:48.435803 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:40:48 crc kubenswrapper[4958]: E0320 09:40:48.436346 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 
Mar 20 09:41:02 crc kubenswrapper[4958]: I0320 09:41:02.435815 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:41:02 crc kubenswrapper[4958]: E0320 09:41:02.436982 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:41:14 crc kubenswrapper[4958]: I0320 09:41:14.436012 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:41:14 crc kubenswrapper[4958]: E0320 09:41:14.436903 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:41:25 crc kubenswrapper[4958]: I0320 09:41:25.434544 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:41:25 crc kubenswrapper[4958]: E0320 09:41:25.435407 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:41:37 crc kubenswrapper[4958]: I0320 09:41:37.435518 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:41:37 crc kubenswrapper[4958]: E0320 09:41:37.436306 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:41:48 crc kubenswrapper[4958]: I0320 09:41:48.435267 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:41:48 crc kubenswrapper[4958]: E0320 09:41:48.436228 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.156895 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566662-ckr2r"]
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.158090 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="701c1cd8-a07f-4d9d-ae29-77db3778220c" containerName="oc"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.158252 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="701c1cd8-a07f-4d9d-ae29-77db3778220c" containerName="oc"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.158786 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.161313 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.161503 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.161922 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.169318 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-ckr2r"]
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.226915 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vpgv\" (UniqueName: \"kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv\") pod \"auto-csr-approver-29566662-ckr2r\" (UID: \"e5a71229-c9d3-4e26-b3ac-e6baa545d204\") " pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.328415 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vpgv\" (UniqueName: \"kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv\") pod \"auto-csr-approver-29566662-ckr2r\" (UID: \"e5a71229-c9d3-4e26-b3ac-e6baa545d204\") " pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.354244 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vpgv\" (UniqueName: \"kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv\") pod \"auto-csr-approver-29566662-ckr2r\" (UID: \"e5a71229-c9d3-4e26-b3ac-e6baa545d204\") " pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.439920 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081"
Mar 20 09:42:00 crc kubenswrapper[4958]: E0320 09:42:00.440280 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.480009 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:00 crc kubenswrapper[4958]: I0320 09:42:00.988924 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-ckr2r"]
Mar 20 09:42:01 crc kubenswrapper[4958]: I0320 09:42:01.003806 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Mar 20 09:42:01 crc kubenswrapper[4958]: I0320 09:42:01.153528 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-ckr2r" event={"ID":"e5a71229-c9d3-4e26-b3ac-e6baa545d204","Type":"ContainerStarted","Data":"7b007be6889d9f58ae7bc627e37a9f30ecfe041d5098ea70cc0e401b8398e544"}
Mar 20 09:42:03 crc kubenswrapper[4958]: I0320 09:42:03.175784 4958 generic.go:334] "Generic (PLEG): container finished" podID="e5a71229-c9d3-4e26-b3ac-e6baa545d204" containerID="bc4d2fb86b070afd148f7d2d96c276ad5cb11b98f783998f677ee32237c8205f" exitCode=0
Mar 20 09:42:03 crc kubenswrapper[4958]: I0320 09:42:03.176390 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-ckr2r" event={"ID":"e5a71229-c9d3-4e26-b3ac-e6baa545d204","Type":"ContainerDied","Data":"bc4d2fb86b070afd148f7d2d96c276ad5cb11b98f783998f677ee32237c8205f"}
Mar 20 09:42:04 crc kubenswrapper[4958]: I0320 09:42:04.506025 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Mar 20 09:42:04 crc kubenswrapper[4958]: I0320 09:42:04.595050 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8vpgv\" (UniqueName: \"kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv\") pod \"e5a71229-c9d3-4e26-b3ac-e6baa545d204\" (UID: \"e5a71229-c9d3-4e26-b3ac-e6baa545d204\") "
Mar 20 09:42:04 crc kubenswrapper[4958]: I0320 09:42:04.603799 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv" (OuterVolumeSpecName: "kube-api-access-8vpgv") pod "e5a71229-c9d3-4e26-b3ac-e6baa545d204" (UID: "e5a71229-c9d3-4e26-b3ac-e6baa545d204"). InnerVolumeSpecName "kube-api-access-8vpgv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:42:04 crc kubenswrapper[4958]: I0320 09:42:04.696484 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8vpgv\" (UniqueName: \"kubernetes.io/projected/e5a71229-c9d3-4e26-b3ac-e6baa545d204-kube-api-access-8vpgv\") on node \"crc\" DevicePath \"\""
Mar 20 09:42:05 crc kubenswrapper[4958]: I0320 09:42:05.195957 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-ckr2r"
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566662-ckr2r" Mar 20 09:42:05 crc kubenswrapper[4958]: I0320 09:42:05.195971 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566662-ckr2r" event={"ID":"e5a71229-c9d3-4e26-b3ac-e6baa545d204","Type":"ContainerDied","Data":"7b007be6889d9f58ae7bc627e37a9f30ecfe041d5098ea70cc0e401b8398e544"} Mar 20 09:42:05 crc kubenswrapper[4958]: I0320 09:42:05.196051 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b007be6889d9f58ae7bc627e37a9f30ecfe041d5098ea70cc0e401b8398e544" Mar 20 09:42:05 crc kubenswrapper[4958]: I0320 09:42:05.603998 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-8jtqk"] Mar 20 09:42:05 crc kubenswrapper[4958]: I0320 09:42:05.613231 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566656-8jtqk"] Mar 20 09:42:06 crc kubenswrapper[4958]: I0320 09:42:06.451788 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c085a2ef-2dbe-4f49-b250-e39107f4ed13" path="/var/lib/kubelet/pods/c085a2ef-2dbe-4f49-b250-e39107f4ed13/volumes" Mar 20 09:42:15 crc kubenswrapper[4958]: I0320 09:42:15.434411 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:42:15 crc kubenswrapper[4958]: E0320 09:42:15.435515 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:42:28 crc kubenswrapper[4958]: I0320 09:42:28.434986 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:42:29 crc kubenswrapper[4958]: I0320 09:42:29.388454 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b"} Mar 20 09:42:44 crc kubenswrapper[4958]: I0320 09:42:44.593914 4958 scope.go:117] "RemoveContainer" containerID="5869a3070d2a9937efcb0e005cf6bb00c0c3d2a0f58e9ae28268c49d427ef42a" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.153262 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566664-5r9p5"] Mar 20 09:44:00 crc kubenswrapper[4958]: E0320 09:44:00.154461 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e5a71229-c9d3-4e26-b3ac-e6baa545d204" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.154482 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="e5a71229-c9d3-4e26-b3ac-e6baa545d204" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.155454 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="e5a71229-c9d3-4e26-b3ac-e6baa545d204" containerName="oc" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.156436 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.156953 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gv8f6\" (UniqueName: \"kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6\") pod \"auto-csr-approver-29566664-5r9p5\" (UID: \"7a9177dd-965c-4329-9672-7486c11a89a7\") " pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.160793 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.161156 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.166176 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.170877 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-5r9p5"] Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.258370 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gv8f6\" (UniqueName: \"kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6\") pod \"auto-csr-approver-29566664-5r9p5\" (UID: \"7a9177dd-965c-4329-9672-7486c11a89a7\") " pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.288271 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gv8f6\" (UniqueName: \"kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6\") pod \"auto-csr-approver-29566664-5r9p5\" (UID: \"7a9177dd-965c-4329-9672-7486c11a89a7\") " pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.485798 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:00 crc kubenswrapper[4958]: I0320 09:44:00.919395 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-5r9p5"] Mar 20 09:44:01 crc kubenswrapper[4958]: I0320 09:44:01.192332 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" event={"ID":"7a9177dd-965c-4329-9672-7486c11a89a7","Type":"ContainerStarted","Data":"7c1ab60d11f67b40a105652e12fb5086c83aacfab5c5c3493b427c6a0d2a04d7"} Mar 20 09:44:02 crc kubenswrapper[4958]: I0320 09:44:02.205964 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" event={"ID":"7a9177dd-965c-4329-9672-7486c11a89a7","Type":"ContainerStarted","Data":"bbcf248690df1cc5c3c2e0924d8ddfee8e950dd1714ec7c5211445748b3ed157"} Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.221081 4958 generic.go:334] "Generic (PLEG): container finished" podID="7a9177dd-965c-4329-9672-7486c11a89a7" containerID="bbcf248690df1cc5c3c2e0924d8ddfee8e950dd1714ec7c5211445748b3ed157" exitCode=0 Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.221187 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" event={"ID":"7a9177dd-965c-4329-9672-7486c11a89a7","Type":"ContainerDied","Data":"bbcf248690df1cc5c3c2e0924d8ddfee8e950dd1714ec7c5211445748b3ed157"} Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.545524 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.631116 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gv8f6\" (UniqueName: \"kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6\") pod \"7a9177dd-965c-4329-9672-7486c11a89a7\" (UID: \"7a9177dd-965c-4329-9672-7486c11a89a7\") " Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.638349 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6" (OuterVolumeSpecName: "kube-api-access-gv8f6") pod "7a9177dd-965c-4329-9672-7486c11a89a7" (UID: "7a9177dd-965c-4329-9672-7486c11a89a7"). InnerVolumeSpecName "kube-api-access-gv8f6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:44:03 crc kubenswrapper[4958]: I0320 09:44:03.733043 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gv8f6\" (UniqueName: \"kubernetes.io/projected/7a9177dd-965c-4329-9672-7486c11a89a7-kube-api-access-gv8f6\") on node \"crc\" DevicePath \"\"" Mar 20 09:44:04 crc kubenswrapper[4958]: I0320 09:44:04.230909 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" event={"ID":"7a9177dd-965c-4329-9672-7486c11a89a7","Type":"ContainerDied","Data":"7c1ab60d11f67b40a105652e12fb5086c83aacfab5c5c3493b427c6a0d2a04d7"} Mar 20 09:44:04 crc kubenswrapper[4958]: I0320 09:44:04.230958 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c1ab60d11f67b40a105652e12fb5086c83aacfab5c5c3493b427c6a0d2a04d7" Mar 20 09:44:04 crc kubenswrapper[4958]: I0320 09:44:04.231021 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566664-5r9p5" Mar 20 09:44:04 crc kubenswrapper[4958]: I0320 09:44:04.632423 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-qmz92"] Mar 20 09:44:04 crc kubenswrapper[4958]: I0320 09:44:04.639982 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566658-qmz92"] Mar 20 09:44:06 crc kubenswrapper[4958]: I0320 09:44:06.445610 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c53935-88fc-4ab6-8fc4-e31647d93b52" path="/var/lib/kubelet/pods/82c53935-88fc-4ab6-8fc4-e31647d93b52/volumes" Mar 20 09:44:44 crc kubenswrapper[4958]: I0320 09:44:44.697058 4958 scope.go:117] "RemoveContainer" containerID="7631be9c8ac8067cfb37530d8bc4f10141107bc46b6b9bbc833499b2c57b466d" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.445879 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"] Mar 20 09:44:55 crc kubenswrapper[4958]: E0320 09:44:55.447047 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a9177dd-965c-4329-9672-7486c11a89a7" containerName="oc" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.447063 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a9177dd-965c-4329-9672-7486c11a89a7" containerName="oc" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.447204 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a9177dd-965c-4329-9672-7486c11a89a7" containerName="oc" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.448411 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.463269 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"] Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.538870 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tv9lm\" (UniqueName: \"kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.539037 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.539125 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.640884 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " 
pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.640945 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.641008 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tv9lm\" (UniqueName: \"kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.641833 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.642205 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.666845 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tv9lm\" (UniqueName: \"kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm\") pod \"redhat-operators-vdrkd\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") " pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:55 crc kubenswrapper[4958]: I0320 09:44:55.770239 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vdrkd" Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.251644 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"] Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.521446 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.521508 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.897179 4958 generic.go:334] "Generic (PLEG): container finished" podID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerID="14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d" exitCode=0 Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.897267 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerDied","Data":"14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d"} Mar 20 09:44:56 crc kubenswrapper[4958]: I0320 09:44:56.897299 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerStarted","Data":"61427ec173204256d4c0b09c974951e6dd5badfe5473d253b30511b897d613b2"} Mar 20 09:44:58 crc kubenswrapper[4958]: I0320 09:44:58.914190 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerStarted","Data":"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"} Mar 20 09:44:59 crc kubenswrapper[4958]: I0320 09:44:59.927142 4958 generic.go:334] "Generic (PLEG): container finished" podID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerID="21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb" exitCode=0 Mar 20 09:44:59 crc kubenswrapper[4958]: I0320 09:44:59.927217 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerDied","Data":"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"} Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.153635 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls"] Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.156215 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.159804 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.160222 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.165292 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls"] Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.219539 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vkt\" (UniqueName: \"kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.219687 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.219741 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.321967 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.322057 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.322225 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vkt\" (UniqueName: \"kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.323844 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume\") pod 
\"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.329830 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.344752 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vkt\" (UniqueName: \"kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt\") pod \"collect-profiles-29566665-stkls\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.479965 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.823801 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls"] Mar 20 09:45:00 crc kubenswrapper[4958]: W0320 09:45:00.828016 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcc91991e_151b_4b2a_92f0_68df1717e6f1.slice/crio-b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf WatchSource:0}: Error finding container b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf: Status 404 returned error can't find the container with id b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.944865 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" event={"ID":"cc91991e-151b-4b2a-92f0-68df1717e6f1","Type":"ContainerStarted","Data":"b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf"} Mar 20 09:45:00 crc kubenswrapper[4958]: I0320 09:45:00.947745 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerStarted","Data":"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"} Mar 20 09:45:01 crc kubenswrapper[4958]: I0320 09:45:01.956619 4958 generic.go:334] "Generic (PLEG): container finished" podID="cc91991e-151b-4b2a-92f0-68df1717e6f1" containerID="8341f15925677067e19305db6e22d8d8c537f021e01c45552a9c5cf5748d0146" exitCode=0 Mar 20 09:45:01 crc kubenswrapper[4958]: I0320 09:45:01.956882 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" event={"ID":"cc91991e-151b-4b2a-92f0-68df1717e6f1","Type":"ContainerDied","Data":"8341f15925677067e19305db6e22d8d8c537f021e01c45552a9c5cf5748d0146"} Mar 20 09:45:01 crc kubenswrapper[4958]: I0320 09:45:01.976299 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vdrkd" podStartSLOduration=3.5110718480000003 podStartE2EDuration="6.976271471s" podCreationTimestamp="2026-03-20 09:44:55 +0000 UTC" firstStartedPulling="2026-03-20 
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.494616 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls"
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.637722 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4vkt\" (UniqueName: \"kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt\") pod \"cc91991e-151b-4b2a-92f0-68df1717e6f1\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") "
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.637913 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume\") pod \"cc91991e-151b-4b2a-92f0-68df1717e6f1\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") "
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.637956 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume\") pod \"cc91991e-151b-4b2a-92f0-68df1717e6f1\" (UID: \"cc91991e-151b-4b2a-92f0-68df1717e6f1\") "
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.639142 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume" (OuterVolumeSpecName: "config-volume") pod "cc91991e-151b-4b2a-92f0-68df1717e6f1" (UID: "cc91991e-151b-4b2a-92f0-68df1717e6f1"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.645110 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt" (OuterVolumeSpecName: "kube-api-access-k4vkt") pod "cc91991e-151b-4b2a-92f0-68df1717e6f1" (UID: "cc91991e-151b-4b2a-92f0-68df1717e6f1"). InnerVolumeSpecName "kube-api-access-k4vkt". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.645256 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "cc91991e-151b-4b2a-92f0-68df1717e6f1" (UID: "cc91991e-151b-4b2a-92f0-68df1717e6f1"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.740580 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4vkt\" (UniqueName: \"kubernetes.io/projected/cc91991e-151b-4b2a-92f0-68df1717e6f1-kube-api-access-k4vkt\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.740664 4958 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/cc91991e-151b-4b2a-92f0-68df1717e6f1-config-volume\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.740679 4958 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/cc91991e-151b-4b2a-92f0-68df1717e6f1-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.975252 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls" event={"ID":"cc91991e-151b-4b2a-92f0-68df1717e6f1","Type":"ContainerDied","Data":"b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf"}
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.975686 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c15ab148e24fd977e50c09f09bea88f720f04234a84d1ab4167946dfc43adf"
Mar 20 09:45:03 crc kubenswrapper[4958]: I0320 09:45:03.975865 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566665-stkls"
Mar 20 09:45:04 crc kubenswrapper[4958]: I0320 09:45:04.575426 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"]
Mar 20 09:45:04 crc kubenswrapper[4958]: I0320 09:45:04.581160 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566620-nzxmw"]
Mar 20 09:45:05 crc kubenswrapper[4958]: I0320 09:45:05.770729 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:05 crc kubenswrapper[4958]: I0320 09:45:05.771456 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:06 crc kubenswrapper[4958]: I0320 09:45:06.450740 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5434e504-53f0-41f5-96bc-1981e69b15ac" path="/var/lib/kubelet/pods/5434e504-53f0-41f5-96bc-1981e69b15ac/volumes"
Mar 20 09:45:06 crc kubenswrapper[4958]: I0320 09:45:06.816134 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-vdrkd" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="registry-server" probeResult="failure" output=<
Mar 20 09:45:06 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s
Mar 20 09:45:06 crc kubenswrapper[4958]: >
Mar 20 09:45:15 crc kubenswrapper[4958]: I0320 09:45:15.827122 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:15 crc kubenswrapper[4958]: I0320 09:45:15.881745 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:16 crc kubenswrapper[4958]: I0320 09:45:16.078205 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"]
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.083319 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vdrkd" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="registry-server" containerID="cri-o://1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9" gracePeriod=2
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.491496 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.588020 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tv9lm\" (UniqueName: \"kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm\") pod \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") "
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.588475 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities\") pod \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") "
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.588520 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content\") pod \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\" (UID: \"8f748d3e-6cb9-4aae-a21b-04201883e2a9\") "
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.589659 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities" (OuterVolumeSpecName: "utilities") pod "8f748d3e-6cb9-4aae-a21b-04201883e2a9" (UID: "8f748d3e-6cb9-4aae-a21b-04201883e2a9"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.594039 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm" (OuterVolumeSpecName: "kube-api-access-tv9lm") pod "8f748d3e-6cb9-4aae-a21b-04201883e2a9" (UID: "8f748d3e-6cb9-4aae-a21b-04201883e2a9"). InnerVolumeSpecName "kube-api-access-tv9lm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.691206 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.691260 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tv9lm\" (UniqueName: \"kubernetes.io/projected/8f748d3e-6cb9-4aae-a21b-04201883e2a9-kube-api-access-tv9lm\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.748802 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8f748d3e-6cb9-4aae-a21b-04201883e2a9" (UID: "8f748d3e-6cb9-4aae-a21b-04201883e2a9"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:45:17 crc kubenswrapper[4958]: I0320 09:45:17.793196 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8f748d3e-6cb9-4aae-a21b-04201883e2a9-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.093687 4958 generic.go:334] "Generic (PLEG): container finished" podID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerID="1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9" exitCode=0
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.093748 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerDied","Data":"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"}
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.093791 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vdrkd" event={"ID":"8f748d3e-6cb9-4aae-a21b-04201883e2a9","Type":"ContainerDied","Data":"61427ec173204256d4c0b09c974951e6dd5badfe5473d253b30511b897d613b2"}
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.093819 4958 scope.go:117] "RemoveContainer" containerID="1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.094008 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vdrkd"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.120279 4958 scope.go:117] "RemoveContainer" containerID="21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.133645 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"]
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.139686 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vdrkd"]
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.150373 4958 scope.go:117] "RemoveContainer" containerID="14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.185591 4958 scope.go:117] "RemoveContainer" containerID="1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"
Mar 20 09:45:18 crc kubenswrapper[4958]: E0320 09:45:18.186310 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9\": container with ID starting with 1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9 not found: ID does not exist" containerID="1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.186357 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9"} err="failed to get container status \"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9\": rpc error: code = NotFound desc = could not find container \"1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9\": container with ID starting with 1a928f15d16de1e700f1a738e7cad843ad0e1198d4b980dca949fe303b1d63d9 not found: ID does not exist"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.186395 4958 scope.go:117] "RemoveContainer" containerID="21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"
Mar 20 09:45:18 crc kubenswrapper[4958]: E0320 09:45:18.186871 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb\": container with ID starting with 21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb not found: ID does not exist" containerID="21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.186928 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb"} err="failed to get container status \"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb\": rpc error: code = NotFound desc = could not find container \"21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb\": container with ID starting with 21c63ec0e8c352ebc0044a8f2faa9a189bb1b24d9ea1df7ce129771a4cdedfcb not found: ID does not exist"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.186965 4958 scope.go:117] "RemoveContainer" containerID="14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d"
Mar 20 09:45:18 crc kubenswrapper[4958]: E0320 09:45:18.187474 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d\": container with ID starting with 14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d not found: ID does not exist" containerID="14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.187532 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d"} err="failed to get container status \"14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d\": rpc error: code = NotFound desc = could not find container \"14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d\": container with ID starting with 14ba63c844c1d7860705490be84685d28c28b482ba2901bc867e3a7ef8786e2d not found: ID does not exist"
Mar 20 09:45:18 crc kubenswrapper[4958]: I0320 09:45:18.443879 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" path="/var/lib/kubelet/pods/8f748d3e-6cb9-4aae-a21b-04201883e2a9/volumes"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.476376 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"]
Mar 20 09:45:24 crc kubenswrapper[4958]: E0320 09:45:24.477212 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cc91991e-151b-4b2a-92f0-68df1717e6f1" containerName="collect-profiles"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477233 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="cc91991e-151b-4b2a-92f0-68df1717e6f1" containerName="collect-profiles"
Mar 20 09:45:24 crc kubenswrapper[4958]: E0320 09:45:24.477247 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="registry-server"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477254 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="registry-server"
Mar 20 09:45:24 crc kubenswrapper[4958]: E0320 09:45:24.477278 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="extract-utilities"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477284 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="extract-utilities"
Mar 20 09:45:24 crc kubenswrapper[4958]: E0320 09:45:24.477294 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="extract-content"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477300 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="extract-content"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477545 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="cc91991e-151b-4b2a-92f0-68df1717e6f1" containerName="collect-profiles"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.477559 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="8f748d3e-6cb9-4aae-a21b-04201883e2a9" containerName="registry-server"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.478967 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.490777 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"]
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.598896 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.598964 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.599791 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5brs\" (UniqueName: \"kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.701890 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.701953 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.702047 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5brs\" (UniqueName: \"kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.702574 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.702663 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.731569 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5brs\" (UniqueName: \"kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs\") pod \"redhat-marketplace-txshd\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") " pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:24 crc kubenswrapper[4958]: I0320 09:45:24.802640 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:25 crc kubenswrapper[4958]: I0320 09:45:25.294313 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"]
Mar 20 09:45:25 crc kubenswrapper[4958]: W0320 09:45:25.298271 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2b9fdf6a_f51f_4213_8daf_83ec01908cc4.slice/crio-4e839fa6a7bbc63adc0ec1e25673386aaa43947fe5a83c7e8a810968a4f4ce74 WatchSource:0}: Error finding container 4e839fa6a7bbc63adc0ec1e25673386aaa43947fe5a83c7e8a810968a4f4ce74: Status 404 returned error can't find the container with id 4e839fa6a7bbc63adc0ec1e25673386aaa43947fe5a83c7e8a810968a4f4ce74
Mar 20 09:45:26 crc kubenswrapper[4958]: I0320 09:45:26.172501 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerID="df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257" exitCode=0
Mar 20 09:45:26 crc kubenswrapper[4958]: I0320 09:45:26.172576 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerDied","Data":"df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257"}
Mar 20 09:45:26 crc kubenswrapper[4958]: I0320 09:45:26.172664 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerStarted","Data":"4e839fa6a7bbc63adc0ec1e25673386aaa43947fe5a83c7e8a810968a4f4ce74"}
Mar 20 09:45:26 crc kubenswrapper[4958]: I0320 09:45:26.520741 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:45:26 crc kubenswrapper[4958]: I0320 09:45:26.521059 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:45:27 crc kubenswrapper[4958]: I0320 09:45:27.185188 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerID="e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f" exitCode=0
Mar 20 09:45:27 crc kubenswrapper[4958]: I0320 09:45:27.185249 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerDied","Data":"e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f"}
Mar 20 09:45:28 crc kubenswrapper[4958]: I0320 09:45:28.196675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerStarted","Data":"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c"}
Mar 20 09:45:28 crc kubenswrapper[4958]: I0320 09:45:28.221382 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-txshd" podStartSLOduration=2.761708025 podStartE2EDuration="4.221356081s" podCreationTimestamp="2026-03-20 09:45:24 +0000 UTC" firstStartedPulling="2026-03-20 09:45:26.175068432 +0000 UTC m=+2746.497084390" lastFinishedPulling="2026-03-20 09:45:27.634716488 +0000 UTC m=+2747.956732446" observedRunningTime="2026-03-20 09:45:28.214915563 +0000 UTC m=+2748.536931521" watchObservedRunningTime="2026-03-20 09:45:28.221356081 +0000 UTC m=+2748.543372039"
Mar 20 09:45:34 crc kubenswrapper[4958]: I0320 09:45:34.804387 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:34 crc kubenswrapper[4958]: I0320 09:45:34.805342 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:34 crc kubenswrapper[4958]: I0320 09:45:34.859790 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:35 crc kubenswrapper[4958]: I0320 09:45:35.303848 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:35 crc kubenswrapper[4958]: I0320 09:45:35.357574 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"]
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.267148 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-txshd" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="registry-server" containerID="cri-o://3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c" gracePeriod=2
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.789833 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txshd"
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.931204 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content\") pod \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") "
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.931398 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities\") pod \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") "
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.931563 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5brs\" (UniqueName: \"kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs\") pod \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\" (UID: \"2b9fdf6a-f51f-4213-8daf-83ec01908cc4\") "
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.932396 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities" (OuterVolumeSpecName: "utilities") pod "2b9fdf6a-f51f-4213-8daf-83ec01908cc4" (UID: "2b9fdf6a-f51f-4213-8daf-83ec01908cc4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.932917 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.939191 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs" (OuterVolumeSpecName: "kube-api-access-r5brs") pod "2b9fdf6a-f51f-4213-8daf-83ec01908cc4" (UID: "2b9fdf6a-f51f-4213-8daf-83ec01908cc4"). InnerVolumeSpecName "kube-api-access-r5brs". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:45:37 crc kubenswrapper[4958]: I0320 09:45:37.969729 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "2b9fdf6a-f51f-4213-8daf-83ec01908cc4" (UID: "2b9fdf6a-f51f-4213-8daf-83ec01908cc4"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.035018 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5brs\" (UniqueName: \"kubernetes.io/projected/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-kube-api-access-r5brs\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.035074 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/2b9fdf6a-f51f-4213-8daf-83ec01908cc4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.276335 4958 generic.go:334] "Generic (PLEG): container finished" podID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerID="3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c" exitCode=0
Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.276433 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-txshd" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.276456 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerDied","Data":"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c"} Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.276947 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-txshd" event={"ID":"2b9fdf6a-f51f-4213-8daf-83ec01908cc4","Type":"ContainerDied","Data":"4e839fa6a7bbc63adc0ec1e25673386aaa43947fe5a83c7e8a810968a4f4ce74"} Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.276978 4958 scope.go:117] "RemoveContainer" containerID="3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.300131 4958 scope.go:117] "RemoveContainer" containerID="e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.321872 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"] Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.328192 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-txshd"] Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.337238 4958 scope.go:117] "RemoveContainer" containerID="df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.357548 4958 scope.go:117] "RemoveContainer" containerID="3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c" Mar 20 09:45:38 crc kubenswrapper[4958]: E0320 09:45:38.358091 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c\": container with ID starting with 3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c not found: ID does not exist" containerID="3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.358142 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c"} err="failed to get container status \"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c\": rpc error: code = NotFound desc = could not find container \"3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c\": container with ID starting with 3792f548518f444f8685c05583afa891686776c50088c48e87de82d48c7cef3c not found: ID does not exist" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.358174 4958 scope.go:117] "RemoveContainer" containerID="e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f" Mar 20 09:45:38 crc kubenswrapper[4958]: E0320 09:45:38.358522 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f\": container with ID starting with e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f not found: ID does not exist" containerID="e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.358543 4958 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f"} err="failed to get container status \"e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f\": rpc error: code = NotFound desc = could not find container \"e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f\": container with ID starting with e18e4dcb219a75de0e773b6495acc3f288d5e459236cc3cbf2baad818b4b1a7f not found: ID does not exist" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.358559 4958 scope.go:117] "RemoveContainer" containerID="df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257" Mar 20 09:45:38 crc kubenswrapper[4958]: E0320 09:45:38.358878 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257\": container with ID starting with df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257 not found: ID does not exist" containerID="df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.358906 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257"} err="failed to get container status \"df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257\": rpc error: code = NotFound desc = could not find container \"df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257\": container with ID starting with df2e76599878434f08684aff940727d401f4eab9acdd3ba8225a44ee15482257 not found: ID does not exist" Mar 20 09:45:38 crc kubenswrapper[4958]: I0320 09:45:38.445030 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" path="/var/lib/kubelet/pods/2b9fdf6a-f51f-4213-8daf-83ec01908cc4/volumes" Mar 20 09:45:44 crc kubenswrapper[4958]: I0320 09:45:44.778958 4958 scope.go:117] "RemoveContainer" containerID="5fe83ebb49b2b9ed133cdce65b5dd206dba5038eb5a663ef3adecb3ba8944ddd" Mar 20 09:45:56 crc kubenswrapper[4958]: I0320 09:45:56.521249 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:45:56 crc kubenswrapper[4958]: I0320 09:45:56.522045 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:45:56 crc kubenswrapper[4958]: I0320 09:45:56.522121 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:45:56 crc kubenswrapper[4958]: I0320 09:45:56.523159 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness 
probe, will be restarted" Mar 20 09:45:56 crc kubenswrapper[4958]: I0320 09:45:56.523242 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b" gracePeriod=600 Mar 20 09:45:57 crc kubenswrapper[4958]: I0320 09:45:57.438530 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b" exitCode=0 Mar 20 09:45:57 crc kubenswrapper[4958]: I0320 09:45:57.438570 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b"} Mar 20 09:45:57 crc kubenswrapper[4958]: I0320 09:45:57.439468 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e"} Mar 20 09:45:57 crc kubenswrapper[4958]: I0320 09:45:57.439498 4958 scope.go:117] "RemoveContainer" containerID="81132eafb1730061f8ed0091c9483e3c56701ffddda72b585dea16a5c5b14081" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.154502 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566666-76fjf"] Mar 20 09:46:00 crc kubenswrapper[4958]: E0320 09:46:00.155362 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="extract-content" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.155385 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="extract-content" Mar 20 09:46:00 crc kubenswrapper[4958]: E0320 09:46:00.155416 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="extract-utilities" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.155427 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="extract-utilities" Mar 20 09:46:00 crc kubenswrapper[4958]: E0320 09:46:00.155459 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="registry-server" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.155471 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="registry-server" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.155752 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b9fdf6a-f51f-4213-8daf-83ec01908cc4" containerName="registry-server" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.157222 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.161765 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.162004 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.162702 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.170560 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-76fjf"] Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.307905 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqsj9\" (UniqueName: \"kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9\") pod \"auto-csr-approver-29566666-76fjf\" (UID: \"c0ffa391-7a3f-40c1-bb17-051efee4cc88\") " pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.410263 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fqsj9\" (UniqueName: \"kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9\") pod \"auto-csr-approver-29566666-76fjf\" (UID: \"c0ffa391-7a3f-40c1-bb17-051efee4cc88\") " pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.440104 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqsj9\" (UniqueName: \"kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9\") pod \"auto-csr-approver-29566666-76fjf\" (UID: \"c0ffa391-7a3f-40c1-bb17-051efee4cc88\") " pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.478778 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:00 crc kubenswrapper[4958]: I0320 09:46:00.903151 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-76fjf"] Mar 20 09:46:01 crc kubenswrapper[4958]: I0320 09:46:01.493706 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-76fjf" event={"ID":"c0ffa391-7a3f-40c1-bb17-051efee4cc88","Type":"ContainerStarted","Data":"f6b7e0ef6cc9325ac0efdea0b64110b0aebbfb484d1db21bf4627da595b9a2b3"} Mar 20 09:46:02 crc kubenswrapper[4958]: I0320 09:46:02.503769 4958 generic.go:334] "Generic (PLEG): container finished" podID="c0ffa391-7a3f-40c1-bb17-051efee4cc88" containerID="6e17b1aef004db6104a87bbf11f375c90d6f20469fed139d6c371457397d0b6e" exitCode=0 Mar 20 09:46:02 crc kubenswrapper[4958]: I0320 09:46:02.503888 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-76fjf" event={"ID":"c0ffa391-7a3f-40c1-bb17-051efee4cc88","Type":"ContainerDied","Data":"6e17b1aef004db6104a87bbf11f375c90d6f20469fed139d6c371457397d0b6e"} Mar 20 09:46:03 crc kubenswrapper[4958]: I0320 09:46:03.810021 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:03 crc kubenswrapper[4958]: I0320 09:46:03.869810 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsj9\" (UniqueName: \"kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9\") pod \"c0ffa391-7a3f-40c1-bb17-051efee4cc88\" (UID: \"c0ffa391-7a3f-40c1-bb17-051efee4cc88\") " Mar 20 09:46:03 crc kubenswrapper[4958]: I0320 09:46:03.878731 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9" (OuterVolumeSpecName: "kube-api-access-fqsj9") pod "c0ffa391-7a3f-40c1-bb17-051efee4cc88" (UID: "c0ffa391-7a3f-40c1-bb17-051efee4cc88"). InnerVolumeSpecName "kube-api-access-fqsj9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:46:03 crc kubenswrapper[4958]: I0320 09:46:03.973304 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsj9\" (UniqueName: \"kubernetes.io/projected/c0ffa391-7a3f-40c1-bb17-051efee4cc88-kube-api-access-fqsj9\") on node \"crc\" DevicePath \"\"" Mar 20 09:46:04 crc kubenswrapper[4958]: I0320 09:46:04.524320 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566666-76fjf" event={"ID":"c0ffa391-7a3f-40c1-bb17-051efee4cc88","Type":"ContainerDied","Data":"f6b7e0ef6cc9325ac0efdea0b64110b0aebbfb484d1db21bf4627da595b9a2b3"} Mar 20 09:46:04 crc kubenswrapper[4958]: I0320 09:46:04.524854 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6b7e0ef6cc9325ac0efdea0b64110b0aebbfb484d1db21bf4627da595b9a2b3" Mar 20 09:46:04 crc kubenswrapper[4958]: I0320 09:46:04.524377 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566666-76fjf" Mar 20 09:46:04 crc kubenswrapper[4958]: I0320 09:46:04.895853 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-m5ll8"] Mar 20 09:46:04 crc kubenswrapper[4958]: I0320 09:46:04.903818 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566660-m5ll8"] Mar 20 09:46:06 crc kubenswrapper[4958]: I0320 09:46:06.447077 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="701c1cd8-a07f-4d9d-ae29-77db3778220c" path="/var/lib/kubelet/pods/701c1cd8-a07f-4d9d-ae29-77db3778220c/volumes" Mar 20 09:46:44 crc kubenswrapper[4958]: I0320 09:46:44.861775 4958 scope.go:117] "RemoveContainer" containerID="33b38968d888af72ba7caea84c357f8e0aee6aedef6b974d448ac96c7eaa2815" Mar 20 09:47:56 crc kubenswrapper[4958]: I0320 09:47:56.521421 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:47:56 crc kubenswrapper[4958]: I0320 09:47:56.522318 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.152612 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566668-w8hn7"] Mar 20 09:48:00 crc kubenswrapper[4958]: E0320 09:48:00.153440 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c0ffa391-7a3f-40c1-bb17-051efee4cc88" containerName="oc" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.153464 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="c0ffa391-7a3f-40c1-bb17-051efee4cc88" containerName="oc" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.153729 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="c0ffa391-7a3f-40c1-bb17-051efee4cc88" containerName="oc" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.154379 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.158407 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.158672 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.159552 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.167410 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-w8hn7"] Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.310275 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c4lbs\" (UniqueName: \"kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs\") pod \"auto-csr-approver-29566668-w8hn7\" (UID: \"9787e5e3-5e75-4049-92ab-df4ef208cb7d\") " pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.412212 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c4lbs\" (UniqueName: \"kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs\") pod \"auto-csr-approver-29566668-w8hn7\" (UID: \"9787e5e3-5e75-4049-92ab-df4ef208cb7d\") " pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.437715 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c4lbs\" (UniqueName: \"kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs\") pod \"auto-csr-approver-29566668-w8hn7\" (UID: \"9787e5e3-5e75-4049-92ab-df4ef208cb7d\") " pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.487855 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.913717 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-w8hn7"] Mar 20 09:48:00 crc kubenswrapper[4958]: I0320 09:48:00.925108 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:48:01 crc kubenswrapper[4958]: I0320 09:48:01.938797 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" event={"ID":"9787e5e3-5e75-4049-92ab-df4ef208cb7d","Type":"ContainerStarted","Data":"31852cb133cff4cfc2cca63eec20c1487037a2b2f37d9304516db5711684bc9e"} Mar 20 09:48:02 crc kubenswrapper[4958]: I0320 09:48:02.948842 4958 generic.go:334] "Generic (PLEG): container finished" podID="9787e5e3-5e75-4049-92ab-df4ef208cb7d" containerID="d6307234328af14b8a00524b3cb057e314de9b89ef00c89bd9ac3ca1bea09642" exitCode=0 Mar 20 09:48:02 crc kubenswrapper[4958]: I0320 09:48:02.948896 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" event={"ID":"9787e5e3-5e75-4049-92ab-df4ef208cb7d","Type":"ContainerDied","Data":"d6307234328af14b8a00524b3cb057e314de9b89ef00c89bd9ac3ca1bea09642"} Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.233264 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.413144 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c4lbs\" (UniqueName: \"kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs\") pod \"9787e5e3-5e75-4049-92ab-df4ef208cb7d\" (UID: \"9787e5e3-5e75-4049-92ab-df4ef208cb7d\") " Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.421144 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs" (OuterVolumeSpecName: "kube-api-access-c4lbs") pod "9787e5e3-5e75-4049-92ab-df4ef208cb7d" (UID: "9787e5e3-5e75-4049-92ab-df4ef208cb7d"). InnerVolumeSpecName "kube-api-access-c4lbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.515564 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c4lbs\" (UniqueName: \"kubernetes.io/projected/9787e5e3-5e75-4049-92ab-df4ef208cb7d-kube-api-access-c4lbs\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.989802 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" event={"ID":"9787e5e3-5e75-4049-92ab-df4ef208cb7d","Type":"ContainerDied","Data":"31852cb133cff4cfc2cca63eec20c1487037a2b2f37d9304516db5711684bc9e"} Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.990326 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="31852cb133cff4cfc2cca63eec20c1487037a2b2f37d9304516db5711684bc9e" Mar 20 09:48:04 crc kubenswrapper[4958]: I0320 09:48:04.990409 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566668-w8hn7" Mar 20 09:48:05 crc kubenswrapper[4958]: I0320 09:48:05.311167 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-ckr2r"] Mar 20 09:48:05 crc kubenswrapper[4958]: I0320 09:48:05.318281 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566662-ckr2r"] Mar 20 09:48:06 crc kubenswrapper[4958]: I0320 09:48:06.445794 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e5a71229-c9d3-4e26-b3ac-e6baa545d204" path="/var/lib/kubelet/pods/e5a71229-c9d3-4e26-b3ac-e6baa545d204/volumes" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.461780 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:20 crc kubenswrapper[4958]: E0320 09:48:20.462983 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9787e5e3-5e75-4049-92ab-df4ef208cb7d" containerName="oc" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.462998 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9787e5e3-5e75-4049-92ab-df4ef208cb7d" containerName="oc" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.463143 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9787e5e3-5e75-4049-92ab-df4ef208cb7d" containerName="oc" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.464223 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.470654 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.609416 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.609481 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.609519 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k97vs\" (UniqueName: \"kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.710685 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.710753 4958 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.710793 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k97vs\" (UniqueName: \"kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.711393 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.711453 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.732029 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k97vs\" (UniqueName: \"kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs\") pod \"certified-operators-v4rgb\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.788922 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.853757 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-twjhj"] Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.855454 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:20 crc kubenswrapper[4958]: I0320 09:48:20.878270 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twjhj"] Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.014217 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-utilities\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.014823 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-catalog-content\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.014928 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcdcg\" (UniqueName: \"kubernetes.io/projected/706d1733-3305-41ee-b973-c39d579f4683-kube-api-access-rcdcg\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.115966 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcdcg\" (UniqueName: \"kubernetes.io/projected/706d1733-3305-41ee-b973-c39d579f4683-kube-api-access-rcdcg\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.116043 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-utilities\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.116078 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-catalog-content\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.116595 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-catalog-content\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.116944 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/706d1733-3305-41ee-b973-c39d579f4683-utilities\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.163654 4958 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-rcdcg\" (UniqueName: \"kubernetes.io/projected/706d1733-3305-41ee-b973-c39d579f4683-kube-api-access-rcdcg\") pod \"community-operators-twjhj\" (UID: \"706d1733-3305-41ee-b973-c39d579f4683\") " pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.191322 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.304144 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:21 crc kubenswrapper[4958]: I0320 09:48:21.748463 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twjhj"] Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.135709 4958 generic.go:334] "Generic (PLEG): container finished" podID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerID="af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b" exitCode=0 Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.135819 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerDied","Data":"af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b"} Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.135863 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerStarted","Data":"1e89610ee47f1bb79ecbfcdb3f873df5fde68591c0c4e32a0c6d849931b039e9"} Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.137535 4958 generic.go:334] "Generic (PLEG): container finished" podID="706d1733-3305-41ee-b973-c39d579f4683" containerID="7bd37d4774bc260ea5fb016f68c0ae07d2d4c04bc7ef24a75e412dbbaf240980" exitCode=0 Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.137637 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twjhj" event={"ID":"706d1733-3305-41ee-b973-c39d579f4683","Type":"ContainerDied","Data":"7bd37d4774bc260ea5fb016f68c0ae07d2d4c04bc7ef24a75e412dbbaf240980"} Mar 20 09:48:22 crc kubenswrapper[4958]: I0320 09:48:22.137675 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twjhj" event={"ID":"706d1733-3305-41ee-b973-c39d579f4683","Type":"ContainerStarted","Data":"6681c1e21f09846536b1bc54f2e963c22fe56a51dfded465e638b1cf33619448"} Mar 20 09:48:23 crc kubenswrapper[4958]: I0320 09:48:23.152936 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerStarted","Data":"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449"} Mar 20 09:48:24 crc kubenswrapper[4958]: I0320 09:48:24.163408 4958 generic.go:334] "Generic (PLEG): container finished" podID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerID="8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449" exitCode=0 Mar 20 09:48:24 crc kubenswrapper[4958]: I0320 09:48:24.163471 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerDied","Data":"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449"} Mar 20 
09:48:26 crc kubenswrapper[4958]: I0320 09:48:26.522259 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:48:26 crc kubenswrapper[4958]: I0320 09:48:26.524403 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:48:28 crc kubenswrapper[4958]: I0320 09:48:28.729858 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerStarted","Data":"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c"} Mar 20 09:48:28 crc kubenswrapper[4958]: I0320 09:48:28.729913 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twjhj" event={"ID":"706d1733-3305-41ee-b973-c39d579f4683","Type":"ContainerStarted","Data":"5e0f95c5b74dc29cb8652eb524e305864d8f7aca746099de45824f56938d8f59"} Mar 20 09:48:28 crc kubenswrapper[4958]: I0320 09:48:28.760259 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v4rgb" podStartSLOduration=3.235475453 podStartE2EDuration="8.760239048s" podCreationTimestamp="2026-03-20 09:48:20 +0000 UTC" firstStartedPulling="2026-03-20 09:48:22.138031625 +0000 UTC m=+2922.460047583" lastFinishedPulling="2026-03-20 09:48:27.66279522 +0000 UTC m=+2927.984811178" observedRunningTime="2026-03-20 09:48:28.756567787 +0000 UTC m=+2929.078583745" watchObservedRunningTime="2026-03-20 09:48:28.760239048 +0000 UTC m=+2929.082255006" Mar 20 09:48:29 crc kubenswrapper[4958]: I0320 09:48:29.738971 4958 generic.go:334] "Generic (PLEG): container finished" podID="706d1733-3305-41ee-b973-c39d579f4683" containerID="5e0f95c5b74dc29cb8652eb524e305864d8f7aca746099de45824f56938d8f59" exitCode=0 Mar 20 09:48:29 crc kubenswrapper[4958]: I0320 09:48:29.739064 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twjhj" event={"ID":"706d1733-3305-41ee-b973-c39d579f4683","Type":"ContainerDied","Data":"5e0f95c5b74dc29cb8652eb524e305864d8f7aca746099de45824f56938d8f59"} Mar 20 09:48:30 crc kubenswrapper[4958]: I0320 09:48:30.789060 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:30 crc kubenswrapper[4958]: I0320 09:48:30.789639 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:30 crc kubenswrapper[4958]: I0320 09:48:30.842367 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:33 crc kubenswrapper[4958]: I0320 09:48:33.769041 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-twjhj" event={"ID":"706d1733-3305-41ee-b973-c39d579f4683","Type":"ContainerStarted","Data":"c3541822c9568c49a45198475380d659c2ffb02376379493c75d64b68075000f"} Mar 20 09:48:33 crc kubenswrapper[4958]: I0320 
09:48:33.791043 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-twjhj" podStartSLOduration=3.058926596 podStartE2EDuration="13.791023431s" podCreationTimestamp="2026-03-20 09:48:20 +0000 UTC" firstStartedPulling="2026-03-20 09:48:22.141924052 +0000 UTC m=+2922.463940010" lastFinishedPulling="2026-03-20 09:48:32.874020897 +0000 UTC m=+2933.196036845" observedRunningTime="2026-03-20 09:48:33.787804332 +0000 UTC m=+2934.109820300" watchObservedRunningTime="2026-03-20 09:48:33.791023431 +0000 UTC m=+2934.113039389" Mar 20 09:48:40 crc kubenswrapper[4958]: I0320 09:48:40.860139 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:40 crc kubenswrapper[4958]: I0320 09:48:40.916005 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:41 crc kubenswrapper[4958]: I0320 09:48:41.191704 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:41 crc kubenswrapper[4958]: I0320 09:48:41.192251 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:41 crc kubenswrapper[4958]: I0320 09:48:41.237725 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:41 crc kubenswrapper[4958]: I0320 09:48:41.851959 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v4rgb" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="registry-server" containerID="cri-o://f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c" gracePeriod=2 Mar 20 09:48:41 crc kubenswrapper[4958]: I0320 09:48:41.915191 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-twjhj" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.256275 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.279862 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities\") pod \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.279969 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k97vs\" (UniqueName: \"kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs\") pod \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.279996 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content\") pod \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\" (UID: \"4355440c-e52e-4b72-b1f9-7b93c9d960c0\") " Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.281788 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities" (OuterVolumeSpecName: "utilities") pod "4355440c-e52e-4b72-b1f9-7b93c9d960c0" (UID: "4355440c-e52e-4b72-b1f9-7b93c9d960c0"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.290675 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs" (OuterVolumeSpecName: "kube-api-access-k97vs") pod "4355440c-e52e-4b72-b1f9-7b93c9d960c0" (UID: "4355440c-e52e-4b72-b1f9-7b93c9d960c0"). InnerVolumeSpecName "kube-api-access-k97vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.335459 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4355440c-e52e-4b72-b1f9-7b93c9d960c0" (UID: "4355440c-e52e-4b72-b1f9-7b93c9d960c0"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.368469 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-twjhj"] Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.387993 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.388047 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k97vs\" (UniqueName: \"kubernetes.io/projected/4355440c-e52e-4b72-b1f9-7b93c9d960c0-kube-api-access-k97vs\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.388063 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4355440c-e52e-4b72-b1f9-7b93c9d960c0-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.704257 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.704675 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-rbv9h" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="registry-server" containerID="cri-o://0b2c700278493776cb0b09fd3e4fb34a7c6921b51536a6ac28817cc0a89dfc84" gracePeriod=2 Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.866205 4958 generic.go:334] "Generic (PLEG): container finished" podID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerID="f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c" exitCode=0 Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.866294 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerDied","Data":"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c"} Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.866857 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v4rgb" event={"ID":"4355440c-e52e-4b72-b1f9-7b93c9d960c0","Type":"ContainerDied","Data":"1e89610ee47f1bb79ecbfcdb3f873df5fde68591c0c4e32a0c6d849931b039e9"} Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.866891 4958 scope.go:117] "RemoveContainer" containerID="f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.866414 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v4rgb" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.877809 4958 generic.go:334] "Generic (PLEG): container finished" podID="0930a6b5-25c2-441d-8204-b483adf7da51" containerID="0b2c700278493776cb0b09fd3e4fb34a7c6921b51536a6ac28817cc0a89dfc84" exitCode=0 Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.877893 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerDied","Data":"0b2c700278493776cb0b09fd3e4fb34a7c6921b51536a6ac28817cc0a89dfc84"} Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.897441 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.909101 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v4rgb"] Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.909892 4958 scope.go:117] "RemoveContainer" containerID="8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.934006 4958 scope.go:117] "RemoveContainer" containerID="af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.958208 4958 scope.go:117] "RemoveContainer" containerID="f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c" Mar 20 09:48:42 crc kubenswrapper[4958]: E0320 09:48:42.960158 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c\": container with ID starting with f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c not found: ID does not exist" containerID="f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.960204 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c"} err="failed to get container status \"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c\": rpc error: code = NotFound desc = could not find container \"f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c\": container with ID starting with f7d3b3208f1a6c408a4e80899048e0b95d506666c328cfd2702970f4eff33e3c not found: ID does not exist" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.960241 4958 scope.go:117] "RemoveContainer" containerID="8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449" Mar 20 09:48:42 crc kubenswrapper[4958]: E0320 09:48:42.961328 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449\": container with ID starting with 8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449 not found: ID does not exist" containerID="8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.961352 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449"} err="failed to get container status 
\"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449\": rpc error: code = NotFound desc = could not find container \"8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449\": container with ID starting with 8418d461a42799ddb76883ab5fec9c246965b2af7eb47bcc01d2a215ad0ae449 not found: ID does not exist" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.961369 4958 scope.go:117] "RemoveContainer" containerID="af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b" Mar 20 09:48:42 crc kubenswrapper[4958]: E0320 09:48:42.969419 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b\": container with ID starting with af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b not found: ID does not exist" containerID="af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b" Mar 20 09:48:42 crc kubenswrapper[4958]: I0320 09:48:42.969463 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b"} err="failed to get container status \"af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b\": rpc error: code = NotFound desc = could not find container \"af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b\": container with ID starting with af0c9b7f65bae24d3e119d9420c149ad955139b8ddb3869ef84c8f8c3ff7c24b not found: ID does not exist" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.167849 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.199877 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content\") pod \"0930a6b5-25c2-441d-8204-b483adf7da51\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.199996 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities\") pod \"0930a6b5-25c2-441d-8204-b483adf7da51\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.200050 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42csf\" (UniqueName: \"kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf\") pod \"0930a6b5-25c2-441d-8204-b483adf7da51\" (UID: \"0930a6b5-25c2-441d-8204-b483adf7da51\") " Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.200996 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities" (OuterVolumeSpecName: "utilities") pod "0930a6b5-25c2-441d-8204-b483adf7da51" (UID: "0930a6b5-25c2-441d-8204-b483adf7da51"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.217856 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf" (OuterVolumeSpecName: "kube-api-access-42csf") pod "0930a6b5-25c2-441d-8204-b483adf7da51" (UID: "0930a6b5-25c2-441d-8204-b483adf7da51"). InnerVolumeSpecName "kube-api-access-42csf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.272046 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0930a6b5-25c2-441d-8204-b483adf7da51" (UID: "0930a6b5-25c2-441d-8204-b483adf7da51"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.302262 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42csf\" (UniqueName: \"kubernetes.io/projected/0930a6b5-25c2-441d-8204-b483adf7da51-kube-api-access-42csf\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.302315 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.302330 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0930a6b5-25c2-441d-8204-b483adf7da51-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.896315 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-rbv9h" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.897052 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-rbv9h" event={"ID":"0930a6b5-25c2-441d-8204-b483adf7da51","Type":"ContainerDied","Data":"3ead8d85be346e65114969c2b1885ef2f67063d8662920fdfa2e3ceb7a16db58"} Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.897229 4958 scope.go:117] "RemoveContainer" containerID="0b2c700278493776cb0b09fd3e4fb34a7c6921b51536a6ac28817cc0a89dfc84" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.919191 4958 scope.go:117] "RemoveContainer" containerID="c5a79b86bbee78c6d6b239b1f7e67a6452715e2acc9abc8ac79262809cc522a0" Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.932628 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.939889 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-rbv9h"] Mar 20 09:48:43 crc kubenswrapper[4958]: I0320 09:48:43.958737 4958 scope.go:117] "RemoveContainer" containerID="f2e9f6075254cc62fe265776201f32342fca72830925d1cabe65d30e0cd6fcb8" Mar 20 09:48:44 crc kubenswrapper[4958]: I0320 09:48:44.447079 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" path="/var/lib/kubelet/pods/0930a6b5-25c2-441d-8204-b483adf7da51/volumes" Mar 20 09:48:44 crc kubenswrapper[4958]: I0320 09:48:44.447919 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" path="/var/lib/kubelet/pods/4355440c-e52e-4b72-b1f9-7b93c9d960c0/volumes" Mar 20 09:48:44 crc kubenswrapper[4958]: I0320 09:48:44.951678 4958 scope.go:117] "RemoveContainer" containerID="bc4d2fb86b070afd148f7d2d96c276ad5cb11b98f783998f677ee32237c8205f" Mar 20 09:48:56 crc kubenswrapper[4958]: I0320 09:48:56.520937 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:48:56 crc kubenswrapper[4958]: I0320 09:48:56.521812 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:48:56 crc kubenswrapper[4958]: I0320 09:48:56.521880 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" Mar 20 09:48:56 crc kubenswrapper[4958]: I0320 09:48:56.522504 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 09:48:56 crc kubenswrapper[4958]: I0320 09:48:56.522565 4958 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" gracePeriod=600 Mar 20 09:48:56 crc kubenswrapper[4958]: E0320 09:48:56.662461 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:48:57 crc kubenswrapper[4958]: I0320 09:48:57.016297 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" exitCode=0 Mar 20 09:48:57 crc kubenswrapper[4958]: I0320 09:48:57.016359 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e"} Mar 20 09:48:57 crc kubenswrapper[4958]: I0320 09:48:57.016399 4958 scope.go:117] "RemoveContainer" containerID="af7b0747b671c01f88ecefb15ee2c1afeec2577d02d578b1825d0e4a93acaa7b" Mar 20 09:48:57 crc kubenswrapper[4958]: I0320 09:48:57.017018 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:48:57 crc kubenswrapper[4958]: E0320 09:48:57.017334 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:49:12 crc kubenswrapper[4958]: I0320 09:49:12.435727 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:49:12 crc kubenswrapper[4958]: E0320 09:49:12.436631 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:49:25 crc kubenswrapper[4958]: I0320 09:49:25.434825 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:49:25 crc kubenswrapper[4958]: E0320 09:49:25.437127 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:49:37 crc 
kubenswrapper[4958]: I0320 09:49:37.435403 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:49:37 crc kubenswrapper[4958]: E0320 09:49:37.436575 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.981576 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-x98gp/must-gather-txcp8"] Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.982517 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="extract-utilities" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.982544 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="extract-utilities" Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.982562 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="extract-content" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.982572 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="extract-content" Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.985671 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="extract-utilities" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.985707 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="extract-utilities" Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.985745 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.985753 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.985767 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="extract-content" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.985775 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="extract-content" Mar 20 09:49:46 crc kubenswrapper[4958]: E0320 09:49:46.985803 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.985809 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.986089 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="0930a6b5-25c2-441d-8204-b483adf7da51" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.986122 4958 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4355440c-e52e-4b72-b1f9-7b93c9d960c0" containerName="registry-server" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.987119 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.992198 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-x98gp"/"default-dockercfg-bdqnc" Mar 20 09:49:46 crc kubenswrapper[4958]: I0320 09:49:46.996669 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x98gp"/"kube-root-ca.crt" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.005370 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-x98gp"/"openshift-service-ca.crt" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.039678 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x98gp/must-gather-txcp8"] Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.119250 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c78sf\" (UniqueName: \"kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.119514 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.221713 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c78sf\" (UniqueName: \"kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.221890 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.222438 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.251161 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c78sf\" (UniqueName: \"kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf\") pod \"must-gather-txcp8\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.306410 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:49:47 crc kubenswrapper[4958]: I0320 09:49:47.635628 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-x98gp/must-gather-txcp8"] Mar 20 09:49:48 crc kubenswrapper[4958]: I0320 09:49:48.411059 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x98gp/must-gather-txcp8" event={"ID":"d766b8f4-04b9-4f3e-ab18-6f44fb48861d","Type":"ContainerStarted","Data":"098f635e7fb941b1933087ec11f80a0bd756d67db0ce5c7f6947fb1bc9ca341b"} Mar 20 09:49:48 crc kubenswrapper[4958]: I0320 09:49:48.435813 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:49:48 crc kubenswrapper[4958]: E0320 09:49:48.436095 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:49:54 crc kubenswrapper[4958]: I0320 09:49:54.491379 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x98gp/must-gather-txcp8" event={"ID":"d766b8f4-04b9-4f3e-ab18-6f44fb48861d","Type":"ContainerStarted","Data":"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb"} Mar 20 09:49:55 crc kubenswrapper[4958]: I0320 09:49:55.502095 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x98gp/must-gather-txcp8" event={"ID":"d766b8f4-04b9-4f3e-ab18-6f44fb48861d","Type":"ContainerStarted","Data":"fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127"} Mar 20 09:49:55 crc kubenswrapper[4958]: I0320 09:49:55.528467 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-x98gp/must-gather-txcp8" podStartSLOduration=2.943162019 podStartE2EDuration="9.528442567s" podCreationTimestamp="2026-03-20 09:49:46 +0000 UTC" firstStartedPulling="2026-03-20 09:49:47.642642442 +0000 UTC m=+3007.964658400" lastFinishedPulling="2026-03-20 09:49:54.22792297 +0000 UTC m=+3014.549938948" observedRunningTime="2026-03-20 09:49:55.521562668 +0000 UTC m=+3015.843578626" watchObservedRunningTime="2026-03-20 09:49:55.528442567 +0000 UTC m=+3015.850458525" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.148815 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566670-7pcdr"] Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.150857 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.154161 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.154314 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.154396 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.159407 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-7pcdr"] Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.343472 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6bsrb\" (UniqueName: \"kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb\") pod \"auto-csr-approver-29566670-7pcdr\" (UID: \"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0\") " pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.444896 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6bsrb\" (UniqueName: \"kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb\") pod \"auto-csr-approver-29566670-7pcdr\" (UID: \"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0\") " pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.464995 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6bsrb\" (UniqueName: \"kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb\") pod \"auto-csr-approver-29566670-7pcdr\" (UID: \"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0\") " pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.478510 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:00 crc kubenswrapper[4958]: I0320 09:50:00.962827 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-7pcdr"] Mar 20 09:50:01 crc kubenswrapper[4958]: I0320 09:50:01.546408 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" event={"ID":"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0","Type":"ContainerStarted","Data":"fad13902da656355d33ffe1ab66705c978ced3d02337f1feba9a90a3c3f17dcc"} Mar 20 09:50:02 crc kubenswrapper[4958]: I0320 09:50:02.434815 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:50:02 crc kubenswrapper[4958]: E0320 09:50:02.435692 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:50:02 crc kubenswrapper[4958]: I0320 09:50:02.555075 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" event={"ID":"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0","Type":"ContainerStarted","Data":"9d55d80bbf3be6029382a4a7196dc90cb90da9def65df4da06e8e8924aa76d8a"} Mar 20 09:50:02 crc kubenswrapper[4958]: I0320 09:50:02.574902 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" podStartSLOduration=1.465086537 podStartE2EDuration="2.574875635s" podCreationTimestamp="2026-03-20 09:50:00 +0000 UTC" firstStartedPulling="2026-03-20 09:50:00.970916489 +0000 UTC m=+3021.292932447" lastFinishedPulling="2026-03-20 09:50:02.080705587 +0000 UTC m=+3022.402721545" observedRunningTime="2026-03-20 09:50:02.569661062 +0000 UTC m=+3022.891677030" watchObservedRunningTime="2026-03-20 09:50:02.574875635 +0000 UTC m=+3022.896891603" Mar 20 09:50:03 crc kubenswrapper[4958]: I0320 09:50:03.562889 4958 generic.go:334] "Generic (PLEG): container finished" podID="9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" containerID="9d55d80bbf3be6029382a4a7196dc90cb90da9def65df4da06e8e8924aa76d8a" exitCode=0 Mar 20 09:50:03 crc kubenswrapper[4958]: I0320 09:50:03.562954 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" event={"ID":"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0","Type":"ContainerDied","Data":"9d55d80bbf3be6029382a4a7196dc90cb90da9def65df4da06e8e8924aa76d8a"} Mar 20 09:50:04 crc kubenswrapper[4958]: I0320 09:50:04.878305 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.015272 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6bsrb\" (UniqueName: \"kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb\") pod \"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0\" (UID: \"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0\") " Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.024924 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb" (OuterVolumeSpecName: "kube-api-access-6bsrb") pod "9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" (UID: "9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0"). InnerVolumeSpecName "kube-api-access-6bsrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.118057 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6bsrb\" (UniqueName: \"kubernetes.io/projected/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0-kube-api-access-6bsrb\") on node \"crc\" DevicePath \"\"" Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.579224 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" event={"ID":"9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0","Type":"ContainerDied","Data":"fad13902da656355d33ffe1ab66705c978ced3d02337f1feba9a90a3c3f17dcc"} Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.579270 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fad13902da656355d33ffe1ab66705c978ced3d02337f1feba9a90a3c3f17dcc" Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.579349 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566670-7pcdr" Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.646400 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-5r9p5"] Mar 20 09:50:05 crc kubenswrapper[4958]: I0320 09:50:05.652715 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566664-5r9p5"] Mar 20 09:50:06 crc kubenswrapper[4958]: I0320 09:50:06.444185 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a9177dd-965c-4329-9672-7486c11a89a7" path="/var/lib/kubelet/pods/7a9177dd-965c-4329-9672-7486c11a89a7/volumes" Mar 20 09:50:15 crc kubenswrapper[4958]: I0320 09:50:15.434919 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:50:15 crc kubenswrapper[4958]: E0320 09:50:15.435899 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:50:27 crc kubenswrapper[4958]: I0320 09:50:27.435890 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:50:27 crc kubenswrapper[4958]: E0320 09:50:27.436933 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:50:39 crc kubenswrapper[4958]: I0320 09:50:39.434576 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:50:39 crc kubenswrapper[4958]: E0320 09:50:39.435686 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:50:42 crc kubenswrapper[4958]: I0320 09:50:42.899971 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-bshzx_f3958399-d780-4806-ae2c-2a2479b6d911/init/0.log" Mar 20 09:50:43 crc kubenswrapper[4958]: I0320 09:50:43.172076 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-bshzx_f3958399-d780-4806-ae2c-2a2479b6d911/init/0.log" Mar 20 09:50:43 crc kubenswrapper[4958]: I0320 09:50:43.189190 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-bshzx_f3958399-d780-4806-ae2c-2a2479b6d911/dnsmasq-dns/0.log" Mar 20 09:50:45 crc kubenswrapper[4958]: I0320 09:50:45.070908 4958 scope.go:117] "RemoveContainer" containerID="bbcf248690df1cc5c3c2e0924d8ddfee8e950dd1714ec7c5211445748b3ed157" Mar 20 09:50:50 crc 
kubenswrapper[4958]: I0320 09:50:50.440740 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:50:50 crc kubenswrapper[4958]: E0320 09:50:50.442190 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.383519 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/util/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.564426 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/pull/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.591637 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/util/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.598863 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/pull/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.771675 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/pull/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.789756 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/util/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.799866 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_b4a728943c10eb3a8d997059249394951a4c19e9bf2c22f3d6025f4badrf4hb_c8c24479-3659-4655-a67b-e4601afe1b52/extract/0.log" Mar 20 09:50:58 crc kubenswrapper[4958]: I0320 09:50:58.952234 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-tlzr6_afb56adf-873a-4757-90cb-62cc57e78669/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.187241 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-j4w4r_07df28d7-7683-4309-bee9-9aa2de96b9ce/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.452761 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-2f897_b381ba24-046d-4474-8581-6235812526a7/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.469940 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-b8zbp_22ddf7c6-5d86-436a-b6ea-a622e854725e/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.501009 4958 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-4ljl2_668ba749-8ef8-42fc-bb13-7b5c6e207ed6/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.622256 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-69br5_60ab48da-f2e7-47d0-829e-922b0726e372/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.715928 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-577ccd856-pms6v_6d3c18bd-2666-4490-afbb-dbb844e5dc36/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.832028 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-wq2w4_af8e40f1-7e87-4ed7-8136-1ec1ad714bac/manager/0.log" Mar 20 09:50:59 crc kubenswrapper[4958]: I0320 09:50:59.932422 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-fvr27_46972026-e8fb-46c0-bd8a-93d33a1eaccd/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.048431 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-ch6hb_9d54ed62-2236-4fdc-9fdb-f2042817795e/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.188878 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-5572j_4721bc9e-cb87-47df-a166-cdd08d38568d/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.281561 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-qfwqm_7246ddd6-d5b3-48a0-8581-42e5ff63f6eb/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.435802 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-p95zp_1dc86ca0-19a7-44f2-90f4-40faf6f6308a/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.471963 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-bqxpp_049aadcd-754d-4c89-b1cf-8ae3aa2f7748/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.639081 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-fb9dm_58536825-54ec-4942-a17e-50d7db114ff9/manager/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.839895 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-9df8dd5fd-2jzxj_72562712-a7df-49b8-af2c-6482fd0dcef0/operator/0.log" Mar 20 09:51:00 crc kubenswrapper[4958]: I0320 09:51:00.910366 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-55958644c4-qr9t7_90e05567-054f-41de-a1b4-4dc11ae039db/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.069132 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-2rnn4_6b91c78e-0310-4789-b3ef-caede75e5d1c/registry-server/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.159921 4958 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-llgf2_88be297b-cdd1-4b8d-ae88-eb6219f0f156/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.277734 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-pg9qm_70f92bb8-0cc8-4804-a8d9-d5d3441e953e/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.456666 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-pfz7r_b9ad7ed0-c1c6-4e6e-ae98-29b02f2facdc/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.498029 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-d8b2d_934a0099-92f4-4fd1-b910-28c8a0f50d1e/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.668491 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-glfmx_21dbcd45-579e-42ed-a2ac-c0b9fc9482b8/manager/0.log" Mar 20 09:51:01 crc kubenswrapper[4958]: I0320 09:51:01.719050 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-tl8ls_6db78af7-a32c-44b8-8450-d9478c3f9b1f/manager/0.log" Mar 20 09:51:04 crc kubenswrapper[4958]: I0320 09:51:04.435544 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:51:04 crc kubenswrapper[4958]: E0320 09:51:04.436174 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:51:18 crc kubenswrapper[4958]: I0320 09:51:18.435356 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:51:18 crc kubenswrapper[4958]: E0320 09:51:18.436145 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:51:21 crc kubenswrapper[4958]: I0320 09:51:21.536980 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-v5svb_ebe11c99-e14e-4390-8fd6-6638f0c6ad16/control-plane-machine-set-operator/0.log" Mar 20 09:51:21 crc kubenswrapper[4958]: I0320 09:51:21.711146 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wxtz6_8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5/kube-rbac-proxy/0.log" Mar 20 09:51:21 crc kubenswrapper[4958]: I0320 09:51:21.742230 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-wxtz6_8c29af46-e9d1-4f7d-9ba2-d27a5a1680a5/machine-api-operator/0.log" Mar 20 09:51:32 crc 
kubenswrapper[4958]: I0320 09:51:32.435419 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:51:32 crc kubenswrapper[4958]: E0320 09:51:32.436470 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:51:34 crc kubenswrapper[4958]: I0320 09:51:34.275647 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-mgmxx_5f1f6ba4-f472-4abb-a53d-72e17ac83d43/cert-manager-controller/0.log" Mar 20 09:51:34 crc kubenswrapper[4958]: I0320 09:51:34.393945 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-67r7n_533c37c3-c235-4cc8-9937-96afff9fe513/cert-manager-cainjector/0.log" Mar 20 09:51:34 crc kubenswrapper[4958]: I0320 09:51:34.441757 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-2xx4x_46a3cd52-9d0b-48a4-bf54-39fb49633e56/cert-manager-webhook/0.log" Mar 20 09:51:46 crc kubenswrapper[4958]: I0320 09:51:46.435269 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:51:46 crc kubenswrapper[4958]: E0320 09:51:46.436361 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:51:46 crc kubenswrapper[4958]: I0320 09:51:46.966937 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-444cw_cc9c5966-5322-42c8-b89d-939904508cbf/nmstate-console-plugin/0.log" Mar 20 09:51:47 crc kubenswrapper[4958]: I0320 09:51:47.222664 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-jtx5n_7462bd93-791f-45b3-943b-9c5ebfdf90ee/nmstate-handler/0.log" Mar 20 09:51:47 crc kubenswrapper[4958]: I0320 09:51:47.289718 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-j25jd_edbe510d-bcd7-465b-82e6-8425666a3dae/kube-rbac-proxy/0.log" Mar 20 09:51:47 crc kubenswrapper[4958]: I0320 09:51:47.333011 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-j25jd_edbe510d-bcd7-465b-82e6-8425666a3dae/nmstate-metrics/0.log" Mar 20 09:51:47 crc kubenswrapper[4958]: I0320 09:51:47.505270 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-sjl76_f5dcbca6-977c-48d6-a65c-00cc3f7d8787/nmstate-operator/0.log" Mar 20 09:51:47 crc kubenswrapper[4958]: I0320 09:51:47.563405 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-kqv85_6c6f8675-4ddc-4254-ae04-40cd4b5199d6/nmstate-webhook/0.log" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.150297 4958 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566672-zcldt"] Mar 20 09:52:00 crc kubenswrapper[4958]: E0320 09:52:00.151580 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" containerName="oc" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.151625 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" containerName="oc" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.151848 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" containerName="oc" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.152537 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.155587 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.155653 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.155788 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.166919 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-zcldt"] Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.240704 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dz2ch\" (UniqueName: \"kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch\") pod \"auto-csr-approver-29566672-zcldt\" (UID: \"7f2cd05c-c17b-44dd-85b7-4e5e183846d0\") " pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.342419 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dz2ch\" (UniqueName: \"kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch\") pod \"auto-csr-approver-29566672-zcldt\" (UID: \"7f2cd05c-c17b-44dd-85b7-4e5e183846d0\") " pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.370851 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dz2ch\" (UniqueName: \"kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch\") pod \"auto-csr-approver-29566672-zcldt\" (UID: \"7f2cd05c-c17b-44dd-85b7-4e5e183846d0\") " pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.477609 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:00 crc kubenswrapper[4958]: I0320 09:52:00.747376 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-zcldt"] Mar 20 09:52:01 crc kubenswrapper[4958]: I0320 09:52:01.435473 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:52:01 crc kubenswrapper[4958]: E0320 09:52:01.435945 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:52:01 crc kubenswrapper[4958]: I0320 09:52:01.445312 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-zcldt" event={"ID":"7f2cd05c-c17b-44dd-85b7-4e5e183846d0","Type":"ContainerStarted","Data":"829cb44d49b0aef10b5de3cdac9c0ea7f088b49ecc2ea661eb73daef579f8c73"} Mar 20 09:52:02 crc kubenswrapper[4958]: I0320 09:52:02.453284 4958 generic.go:334] "Generic (PLEG): container finished" podID="7f2cd05c-c17b-44dd-85b7-4e5e183846d0" containerID="b72022e48421b3eafb6795a1a6172a762f4735cf5dd2185e38139c1d706e7436" exitCode=0 Mar 20 09:52:02 crc kubenswrapper[4958]: I0320 09:52:02.453341 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-zcldt" event={"ID":"7f2cd05c-c17b-44dd-85b7-4e5e183846d0","Type":"ContainerDied","Data":"b72022e48421b3eafb6795a1a6172a762f4735cf5dd2185e38139c1d706e7436"} Mar 20 09:52:03 crc kubenswrapper[4958]: I0320 09:52:03.760699 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:03 crc kubenswrapper[4958]: I0320 09:52:03.805803 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dz2ch\" (UniqueName: \"kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch\") pod \"7f2cd05c-c17b-44dd-85b7-4e5e183846d0\" (UID: \"7f2cd05c-c17b-44dd-85b7-4e5e183846d0\") " Mar 20 09:52:03 crc kubenswrapper[4958]: I0320 09:52:03.816829 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch" (OuterVolumeSpecName: "kube-api-access-dz2ch") pod "7f2cd05c-c17b-44dd-85b7-4e5e183846d0" (UID: "7f2cd05c-c17b-44dd-85b7-4e5e183846d0"). InnerVolumeSpecName "kube-api-access-dz2ch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:52:03 crc kubenswrapper[4958]: I0320 09:52:03.908298 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dz2ch\" (UniqueName: \"kubernetes.io/projected/7f2cd05c-c17b-44dd-85b7-4e5e183846d0-kube-api-access-dz2ch\") on node \"crc\" DevicePath \"\"" Mar 20 09:52:04 crc kubenswrapper[4958]: I0320 09:52:04.472466 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566672-zcldt" event={"ID":"7f2cd05c-c17b-44dd-85b7-4e5e183846d0","Type":"ContainerDied","Data":"829cb44d49b0aef10b5de3cdac9c0ea7f088b49ecc2ea661eb73daef579f8c73"} Mar 20 09:52:04 crc kubenswrapper[4958]: I0320 09:52:04.472958 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="829cb44d49b0aef10b5de3cdac9c0ea7f088b49ecc2ea661eb73daef579f8c73" Mar 20 09:52:04 crc kubenswrapper[4958]: I0320 09:52:04.472703 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566672-zcldt" Mar 20 09:52:04 crc kubenswrapper[4958]: I0320 09:52:04.842348 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-76fjf"] Mar 20 09:52:04 crc kubenswrapper[4958]: I0320 09:52:04.848880 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566666-76fjf"] Mar 20 09:52:06 crc kubenswrapper[4958]: I0320 09:52:06.447216 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c0ffa391-7a3f-40c1-bb17-051efee4cc88" path="/var/lib/kubelet/pods/c0ffa391-7a3f-40c1-bb17-051efee4cc88/volumes" Mar 20 09:52:14 crc kubenswrapper[4958]: I0320 09:52:14.886783 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hwwvd_d29fc852-1061-4f79-a204-3dc6a4f73e6c/controller/0.log" Mar 20 09:52:14 crc kubenswrapper[4958]: I0320 09:52:14.887201 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-hwwvd_d29fc852-1061-4f79-a204-3dc6a4f73e6c/kube-rbac-proxy/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.110055 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-frr-files/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.287061 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-reloader/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.305135 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-metrics/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.318274 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-frr-files/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.348063 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-reloader/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.539822 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-reloader/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.541244 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-metrics/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.542646 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-frr-files/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.592834 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-metrics/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.779842 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-frr-files/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.789076 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-reloader/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.795058 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/controller/0.log" Mar 20 09:52:15 crc kubenswrapper[4958]: I0320 09:52:15.826933 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/cp-metrics/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.023625 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/kube-rbac-proxy/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.037476 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/frr-metrics/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.063339 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/kube-rbac-proxy-frr/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.235706 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/reloader/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.262670 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-wbqjj_82a3e089-0afe-4bc8-addb-c3e2ceb6bbfb/frr-k8s-webhook-server/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.335117 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-jsg5p_3669e607-3d8e-4e9e-8468-26d0032e0590/frr/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.434736 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:52:16 crc kubenswrapper[4958]: E0320 09:52:16.435226 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.480844 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-controller-manager-65b48c4558-h8dcf_c1c0a68d-5950-4e09-a7e9-918863cf2008/manager/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.564062 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-79b7b75cdf-mmtj6_fdf4b931-9e36-44d0-b69b-7156d89875d9/webhook-server/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.707514 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt86p_83a41007-6a0b-499e-b7e0-5dbaabb47a9c/kube-rbac-proxy/0.log" Mar 20 09:52:16 crc kubenswrapper[4958]: I0320 09:52:16.873063 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-zt86p_83a41007-6a0b-499e-b7e0-5dbaabb47a9c/speaker/0.log" Mar 20 09:52:27 crc kubenswrapper[4958]: I0320 09:52:27.435413 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:52:27 crc kubenswrapper[4958]: E0320 09:52:27.436186 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:52:30 crc kubenswrapper[4958]: I0320 09:52:30.264236 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/util/0.log" Mar 20 09:52:30 crc kubenswrapper[4958]: I0320 09:52:30.784218 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/pull/0.log" Mar 20 09:52:30 crc kubenswrapper[4958]: I0320 09:52:30.791354 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/util/0.log" Mar 20 09:52:30 crc kubenswrapper[4958]: I0320 09:52:30.896986 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/pull/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.126012 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/pull/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.170448 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/extract/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.211501 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874bp2qn_e0b23e56-fd65-47bf-9aae-fc730031e274/util/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.466303 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/util/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.692006 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/util/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.765069 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/pull/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.765085 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/pull/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.968690 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/util/0.log" Mar 20 09:52:31 crc kubenswrapper[4958]: I0320 09:52:31.993908 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/pull/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.000228 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1j2b2b_2f5ce30c-74f6-431c-9df1-32530fdc4ade/extract/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.174613 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-utilities/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.368106 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-utilities/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.388809 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-content/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.398720 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-content/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.609572 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-content/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.629253 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/extract-utilities/0.log" Mar 20 09:52:32 crc kubenswrapper[4958]: I0320 09:52:32.847907 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-utilities/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.153689 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-content/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.207876 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-content/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.214751 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-utilities/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.409542 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-hghdm_75f0af6a-35bc-4beb-bd7e-4a7c1c37155d/registry-server/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.466973 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-utilities/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.557268 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/extract-content/0.log" Mar 20 09:52:33 crc kubenswrapper[4958]: I0320 09:52:33.847356 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-twjhj_706d1733-3305-41ee-b973-c39d579f4683/registry-server/0.log" Mar 20 09:52:34 crc kubenswrapper[4958]: I0320 09:52:34.349287 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-66h4r_36a69577-98bd-420f-b49a-f004c20de1e0/marketplace-operator/0.log" Mar 20 09:52:34 crc kubenswrapper[4958]: I0320 09:52:34.487034 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-utilities/0.log" Mar 20 09:52:34 crc kubenswrapper[4958]: I0320 09:52:34.730648 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-utilities/0.log" Mar 20 09:52:34 crc kubenswrapper[4958]: I0320 09:52:34.770671 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-content/0.log" Mar 20 09:52:34 crc kubenswrapper[4958]: I0320 09:52:34.778670 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-content/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.000397 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-utilities/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.055500 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/extract-content/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.125269 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-z2tl4_98737b72-788c-4867-b476-d0723c9111d1/registry-server/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.280292 4958 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-utilities/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.497344 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-content/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.504001 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-utilities/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.555147 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-content/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.710897 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-content/0.log" Mar 20 09:52:35 crc kubenswrapper[4958]: I0320 09:52:35.747647 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/extract-utilities/0.log" Mar 20 09:52:36 crc kubenswrapper[4958]: I0320 09:52:36.101827 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-779ld_e817fe38-a7fc-4fc7-8eec-739e3c76b459/registry-server/0.log" Mar 20 09:52:38 crc kubenswrapper[4958]: I0320 09:52:38.435115 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:52:38 crc kubenswrapper[4958]: E0320 09:52:38.435759 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:52:45 crc kubenswrapper[4958]: I0320 09:52:45.193961 4958 scope.go:117] "RemoveContainer" containerID="6e17b1aef004db6104a87bbf11f375c90d6f20469fed139d6c371457397d0b6e" Mar 20 09:52:50 crc kubenswrapper[4958]: I0320 09:52:50.439336 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:52:50 crc kubenswrapper[4958]: E0320 09:52:50.440187 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:01 crc kubenswrapper[4958]: I0320 09:53:01.434847 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:53:01 crc kubenswrapper[4958]: E0320 09:53:01.435996 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:16 crc kubenswrapper[4958]: I0320 09:53:16.469026 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:53:16 crc kubenswrapper[4958]: E0320 09:53:16.470048 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:29 crc kubenswrapper[4958]: I0320 09:53:29.434795 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:53:29 crc kubenswrapper[4958]: E0320 09:53:29.436382 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:40 crc kubenswrapper[4958]: I0320 09:53:40.442829 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:53:40 crc kubenswrapper[4958]: E0320 09:53:40.443924 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:52 crc kubenswrapper[4958]: I0320 09:53:52.435428 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:53:52 crc kubenswrapper[4958]: E0320 09:53:52.436913 4958 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-kvsdf_openshift-machine-config-operator(d3bb0dff-98a7-4359-841f-5fb469ebc3f4)\"" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" Mar 20 09:53:54 crc kubenswrapper[4958]: I0320 09:53:54.395698 4958 generic.go:334] "Generic (PLEG): container finished" podID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerID="c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb" exitCode=0 Mar 20 09:53:54 crc kubenswrapper[4958]: I0320 09:53:54.395757 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-x98gp/must-gather-txcp8" event={"ID":"d766b8f4-04b9-4f3e-ab18-6f44fb48861d","Type":"ContainerDied","Data":"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb"} Mar 20 09:53:54 crc kubenswrapper[4958]: I0320 09:53:54.396466 4958 
scope.go:117] "RemoveContainer" containerID="c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb" Mar 20 09:53:55 crc kubenswrapper[4958]: I0320 09:53:55.445396 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x98gp_must-gather-txcp8_d766b8f4-04b9-4f3e-ab18-6f44fb48861d/gather/0.log" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.178564 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566674-28xz7"] Mar 20 09:54:00 crc kubenswrapper[4958]: E0320 09:54:00.180025 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f2cd05c-c17b-44dd-85b7-4e5e183846d0" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.180049 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f2cd05c-c17b-44dd-85b7-4e5e183846d0" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.180243 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f2cd05c-c17b-44dd-85b7-4e5e183846d0" containerName="oc" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.180867 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.184314 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.184410 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.184338 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.197446 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-28xz7"] Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.295986 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nds95\" (UniqueName: \"kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95\") pod \"auto-csr-approver-29566674-28xz7\" (UID: \"9e464967-8e2e-42fd-836b-234d2723891f\") " pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.397593 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nds95\" (UniqueName: \"kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95\") pod \"auto-csr-approver-29566674-28xz7\" (UID: \"9e464967-8e2e-42fd-836b-234d2723891f\") " pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.436118 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nds95\" (UniqueName: \"kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95\") pod \"auto-csr-approver-29566674-28xz7\" (UID: \"9e464967-8e2e-42fd-836b-234d2723891f\") " pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.510487 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.812146 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566674-28xz7"] Mar 20 09:54:00 crc kubenswrapper[4958]: W0320 09:54:00.822838 4958 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e464967_8e2e_42fd_836b_234d2723891f.slice/crio-c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6 WatchSource:0}: Error finding container c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6: Status 404 returned error can't find the container with id c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6 Mar 20 09:54:00 crc kubenswrapper[4958]: I0320 09:54:00.826192 4958 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 09:54:01 crc kubenswrapper[4958]: I0320 09:54:01.479092 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-28xz7" event={"ID":"9e464967-8e2e-42fd-836b-234d2723891f","Type":"ContainerStarted","Data":"c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6"} Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.486435 4958 generic.go:334] "Generic (PLEG): container finished" podID="9e464967-8e2e-42fd-836b-234d2723891f" containerID="2ff8472c8e6ef4452e6bbb502019dab9576dacb0c84a120bda0162bbf1687ed6" exitCode=0 Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.486501 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-28xz7" event={"ID":"9e464967-8e2e-42fd-836b-234d2723891f","Type":"ContainerDied","Data":"2ff8472c8e6ef4452e6bbb502019dab9576dacb0c84a120bda0162bbf1687ed6"} Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.516185 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-x98gp/must-gather-txcp8"] Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.516568 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-x98gp/must-gather-txcp8" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="copy" containerID="cri-o://fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127" gracePeriod=2 Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.523334 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-x98gp/must-gather-txcp8"] Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.932982 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x98gp_must-gather-txcp8_d766b8f4-04b9-4f3e-ab18-6f44fb48861d/copy/0.log" Mar 20 09:54:02 crc kubenswrapper[4958]: I0320 09:54:02.933934 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.038975 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c78sf\" (UniqueName: \"kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf\") pod \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.039087 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output\") pod \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\" (UID: \"d766b8f4-04b9-4f3e-ab18-6f44fb48861d\") " Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.046282 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf" (OuterVolumeSpecName: "kube-api-access-c78sf") pod "d766b8f4-04b9-4f3e-ab18-6f44fb48861d" (UID: "d766b8f4-04b9-4f3e-ab18-6f44fb48861d"). InnerVolumeSpecName "kube-api-access-c78sf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.139911 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "d766b8f4-04b9-4f3e-ab18-6f44fb48861d" (UID: "d766b8f4-04b9-4f3e-ab18-6f44fb48861d"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.141055 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c78sf\" (UniqueName: \"kubernetes.io/projected/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-kube-api-access-c78sf\") on node \"crc\" DevicePath \"\"" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.141154 4958 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/d766b8f4-04b9-4f3e-ab18-6f44fb48861d-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.495448 4958 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-x98gp_must-gather-txcp8_d766b8f4-04b9-4f3e-ab18-6f44fb48861d/copy/0.log" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.495839 4958 generic.go:334] "Generic (PLEG): container finished" podID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerID="fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127" exitCode=143 Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.496060 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-x98gp/must-gather-txcp8" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.501998 4958 scope.go:117] "RemoveContainer" containerID="fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.550636 4958 scope.go:117] "RemoveContainer" containerID="c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.632077 4958 scope.go:117] "RemoveContainer" containerID="fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127" Mar 20 09:54:03 crc kubenswrapper[4958]: E0320 09:54:03.632703 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127\": container with ID starting with fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127 not found: ID does not exist" containerID="fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.632762 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127"} err="failed to get container status \"fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127\": rpc error: code = NotFound desc = could not find container \"fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127\": container with ID starting with fb2b1c76faa11875511fbfd395217376d13bc935dab9033c3682b37cb206c127 not found: ID does not exist" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.632799 4958 scope.go:117] "RemoveContainer" containerID="c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb" Mar 20 09:54:03 crc kubenswrapper[4958]: E0320 09:54:03.633124 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb\": container with ID starting with c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb not found: ID does not exist" containerID="c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.633150 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb"} err="failed to get container status \"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb\": rpc error: code = NotFound desc = could not find container \"c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb\": container with ID starting with c7e0aa86a1747c28ba5ac6e2a47110f3212132fc81c6c8935d611d21ef60e6eb not found: ID does not exist" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.789366 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.952869 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nds95\" (UniqueName: \"kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95\") pod \"9e464967-8e2e-42fd-836b-234d2723891f\" (UID: \"9e464967-8e2e-42fd-836b-234d2723891f\") " Mar 20 09:54:03 crc kubenswrapper[4958]: I0320 09:54:03.957768 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95" (OuterVolumeSpecName: "kube-api-access-nds95") pod "9e464967-8e2e-42fd-836b-234d2723891f" (UID: "9e464967-8e2e-42fd-836b-234d2723891f"). InnerVolumeSpecName "kube-api-access-nds95". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.054628 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nds95\" (UniqueName: \"kubernetes.io/projected/9e464967-8e2e-42fd-836b-234d2723891f-kube-api-access-nds95\") on node \"crc\" DevicePath \"\"" Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.444242 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" path="/var/lib/kubelet/pods/d766b8f4-04b9-4f3e-ab18-6f44fb48861d/volumes" Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.504151 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566674-28xz7" event={"ID":"9e464967-8e2e-42fd-836b-234d2723891f","Type":"ContainerDied","Data":"c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6"} Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.504222 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c26a291c5c05340bbe0c0eb70718be26b9ba3e494de4fb030ae1eaf6389f90c6" Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.504221 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566674-28xz7" Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.853585 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-w8hn7"] Mar 20 09:54:04 crc kubenswrapper[4958]: I0320 09:54:04.861133 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566668-w8hn7"] Mar 20 09:54:05 crc kubenswrapper[4958]: I0320 09:54:05.435219 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e" Mar 20 09:54:06 crc kubenswrapper[4958]: I0320 09:54:06.444866 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9787e5e3-5e75-4049-92ab-df4ef208cb7d" path="/var/lib/kubelet/pods/9787e5e3-5e75-4049-92ab-df4ef208cb7d/volumes" Mar 20 09:54:06 crc kubenswrapper[4958]: I0320 09:54:06.523388 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"1e1383ccb143b7dd2d877c7552557c47060151058968b967f4edfee63a4ba3bc"} Mar 20 09:54:45 crc kubenswrapper[4958]: I0320 09:54:45.280535 4958 scope.go:117] "RemoveContainer" containerID="d6307234328af14b8a00524b3cb057e314de9b89ef00c89bd9ac3ca1bea09642" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.165734 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566676-khr25"] Mar 20 09:56:00 crc kubenswrapper[4958]: E0320 09:56:00.167159 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="gather" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167177 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="gather" Mar 20 09:56:00 crc kubenswrapper[4958]: E0320 09:56:00.167194 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e464967-8e2e-42fd-836b-234d2723891f" containerName="oc" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167202 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e464967-8e2e-42fd-836b-234d2723891f" containerName="oc" Mar 20 09:56:00 crc kubenswrapper[4958]: E0320 09:56:00.167232 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="copy" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167241 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="copy" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167385 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="gather" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167403 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="d766b8f4-04b9-4f3e-ab18-6f44fb48861d" containerName="copy" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167411 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e464967-8e2e-42fd-836b-234d2723891f" containerName="oc" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.167988 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.170279 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.170996 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.172907 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.182685 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-khr25"] Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.337522 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l5pk4\" (UniqueName: \"kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4\") pod \"auto-csr-approver-29566676-khr25\" (UID: \"465d4af8-206c-4498-973d-0af3e787f461\") " pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.440339 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l5pk4\" (UniqueName: \"kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4\") pod \"auto-csr-approver-29566676-khr25\" (UID: \"465d4af8-206c-4498-973d-0af3e787f461\") " pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.477344 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l5pk4\" (UniqueName: \"kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4\") pod \"auto-csr-approver-29566676-khr25\" (UID: \"465d4af8-206c-4498-973d-0af3e787f461\") " pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.495081 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:00 crc kubenswrapper[4958]: I0320 09:56:00.996260 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566676-khr25"] Mar 20 09:56:01 crc kubenswrapper[4958]: I0320 09:56:01.786048 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-khr25" event={"ID":"465d4af8-206c-4498-973d-0af3e787f461","Type":"ContainerStarted","Data":"d3bc1b5931e9c3993541b0b27f1d214fc6c6814672bfe8b059fb175cc3364831"} Mar 20 09:56:02 crc kubenswrapper[4958]: I0320 09:56:02.795562 4958 generic.go:334] "Generic (PLEG): container finished" podID="465d4af8-206c-4498-973d-0af3e787f461" containerID="ff29683aecd20eff46b1d29c51d89d499ae04476bdf559d6e7da809f0799bfac" exitCode=0 Mar 20 09:56:02 crc kubenswrapper[4958]: I0320 09:56:02.795665 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-khr25" event={"ID":"465d4af8-206c-4498-973d-0af3e787f461","Type":"ContainerDied","Data":"ff29683aecd20eff46b1d29c51d89d499ae04476bdf559d6e7da809f0799bfac"} Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.140205 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.210639 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l5pk4\" (UniqueName: \"kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4\") pod \"465d4af8-206c-4498-973d-0af3e787f461\" (UID: \"465d4af8-206c-4498-973d-0af3e787f461\") " Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.220524 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4" (OuterVolumeSpecName: "kube-api-access-l5pk4") pod "465d4af8-206c-4498-973d-0af3e787f461" (UID: "465d4af8-206c-4498-973d-0af3e787f461"). InnerVolumeSpecName "kube-api-access-l5pk4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.312187 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l5pk4\" (UniqueName: \"kubernetes.io/projected/465d4af8-206c-4498-973d-0af3e787f461-kube-api-access-l5pk4\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.814463 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566676-khr25" event={"ID":"465d4af8-206c-4498-973d-0af3e787f461","Type":"ContainerDied","Data":"d3bc1b5931e9c3993541b0b27f1d214fc6c6814672bfe8b059fb175cc3364831"} Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.814509 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d3bc1b5931e9c3993541b0b27f1d214fc6c6814672bfe8b059fb175cc3364831" Mar 20 09:56:04 crc kubenswrapper[4958]: I0320 09:56:04.814690 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566676-khr25" Mar 20 09:56:05 crc kubenswrapper[4958]: I0320 09:56:05.222982 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-7pcdr"] Mar 20 09:56:05 crc kubenswrapper[4958]: I0320 09:56:05.230651 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566670-7pcdr"] Mar 20 09:56:06 crc kubenswrapper[4958]: I0320 09:56:06.445280 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0" path="/var/lib/kubelet/pods/9d7dbcbf-6187-4ad9-a702-45f2ebcbf9d0/volumes" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.484814 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"] Mar 20 09:56:18 crc kubenswrapper[4958]: E0320 09:56:18.486934 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="465d4af8-206c-4498-973d-0af3e787f461" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.487017 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="465d4af8-206c-4498-973d-0af3e787f461" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.487273 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="465d4af8-206c-4498-973d-0af3e787f461" containerName="oc" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.488609 4958 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.495964 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"] Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.648354 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.648644 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.648771 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrbgp\" (UniqueName: \"kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.749550 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.749646 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.749723 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrbgp\" (UniqueName: \"kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.750200 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.750445 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.774937 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-zrbgp\" (UniqueName: \"kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp\") pod \"redhat-operators-ws9lp\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:18 crc kubenswrapper[4958]: I0320 09:56:18.818962 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:19 crc kubenswrapper[4958]: I0320 09:56:19.255572 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"] Mar 20 09:56:19 crc kubenswrapper[4958]: I0320 09:56:19.947354 4958 generic.go:334] "Generic (PLEG): container finished" podID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerID="25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8" exitCode=0 Mar 20 09:56:19 crc kubenswrapper[4958]: I0320 09:56:19.947431 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerDied","Data":"25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8"} Mar 20 09:56:19 crc kubenswrapper[4958]: I0320 09:56:19.947836 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerStarted","Data":"77cb7b1ae4dacbe9587f2c2bc20c1604e7d31ff9ff0b86f3e3b4769e1f430544"} Mar 20 09:56:21 crc kubenswrapper[4958]: I0320 09:56:21.968780 4958 generic.go:334] "Generic (PLEG): container finished" podID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerID="3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f" exitCode=0 Mar 20 09:56:21 crc kubenswrapper[4958]: I0320 09:56:21.969193 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerDied","Data":"3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f"} Mar 20 09:56:22 crc kubenswrapper[4958]: I0320 09:56:22.978325 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerStarted","Data":"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"} Mar 20 09:56:23 crc kubenswrapper[4958]: I0320 09:56:23.005001 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-ws9lp" podStartSLOduration=2.481525043 podStartE2EDuration="5.004974062s" podCreationTimestamp="2026-03-20 09:56:18 +0000 UTC" firstStartedPulling="2026-03-20 09:56:19.950229494 +0000 UTC m=+3400.272245452" lastFinishedPulling="2026-03-20 09:56:22.473678513 +0000 UTC m=+3402.795694471" observedRunningTime="2026-03-20 09:56:22.999369268 +0000 UTC m=+3403.321385246" watchObservedRunningTime="2026-03-20 09:56:23.004974062 +0000 UTC m=+3403.326990020" Mar 20 09:56:26 crc kubenswrapper[4958]: I0320 09:56:26.521497 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 09:56:26 crc kubenswrapper[4958]: I0320 09:56:26.522060 4958 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 09:56:28 crc kubenswrapper[4958]: I0320 09:56:28.823156 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:28 crc kubenswrapper[4958]: I0320 09:56:28.825954 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:29 crc kubenswrapper[4958]: I0320 09:56:29.889297 4958 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-ws9lp" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="registry-server" probeResult="failure" output=< Mar 20 09:56:29 crc kubenswrapper[4958]: timeout: failed to connect service ":50051" within 1s Mar 20 09:56:29 crc kubenswrapper[4958]: > Mar 20 09:56:38 crc kubenswrapper[4958]: I0320 09:56:38.875025 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:38 crc kubenswrapper[4958]: I0320 09:56:38.928388 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:39 crc kubenswrapper[4958]: I0320 09:56:39.124926 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"] Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.150904 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-ws9lp" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="registry-server" containerID="cri-o://0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d" gracePeriod=2 Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.621337 4958 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-ws9lp" Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.719163 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content\") pod \"fd62ee1d-41a1-4276-a2d5-42662160eed4\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.719247 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zrbgp\" (UniqueName: \"kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp\") pod \"fd62ee1d-41a1-4276-a2d5-42662160eed4\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.719330 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities\") pod \"fd62ee1d-41a1-4276-a2d5-42662160eed4\" (UID: \"fd62ee1d-41a1-4276-a2d5-42662160eed4\") " Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.721309 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities" (OuterVolumeSpecName: "utilities") pod "fd62ee1d-41a1-4276-a2d5-42662160eed4" (UID: "fd62ee1d-41a1-4276-a2d5-42662160eed4"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.727294 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp" (OuterVolumeSpecName: "kube-api-access-zrbgp") pod "fd62ee1d-41a1-4276-a2d5-42662160eed4" (UID: "fd62ee1d-41a1-4276-a2d5-42662160eed4"). InnerVolumeSpecName "kube-api-access-zrbgp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.821326 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zrbgp\" (UniqueName: \"kubernetes.io/projected/fd62ee1d-41a1-4276-a2d5-42662160eed4-kube-api-access-zrbgp\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.821390 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.895529 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fd62ee1d-41a1-4276-a2d5-42662160eed4" (UID: "fd62ee1d-41a1-4276-a2d5-42662160eed4"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:56:40 crc kubenswrapper[4958]: I0320 09:56:40.922515 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fd62ee1d-41a1-4276-a2d5-42662160eed4-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.163546 4958 generic.go:334] "Generic (PLEG): container finished" podID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerID="0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d" exitCode=0
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.163639 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerDied","Data":"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"}
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.163656 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-ws9lp"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.163698 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-ws9lp" event={"ID":"fd62ee1d-41a1-4276-a2d5-42662160eed4","Type":"ContainerDied","Data":"77cb7b1ae4dacbe9587f2c2bc20c1604e7d31ff9ff0b86f3e3b4769e1f430544"}
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.163724 4958 scope.go:117] "RemoveContainer" containerID="0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.191460 4958 scope.go:117] "RemoveContainer" containerID="3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.209342 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"]
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.216178 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-ws9lp"]
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.225008 4958 scope.go:117] "RemoveContainer" containerID="25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.277082 4958 scope.go:117] "RemoveContainer" containerID="0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"
Mar 20 09:56:41 crc kubenswrapper[4958]: E0320 09:56:41.277741 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d\": container with ID starting with 0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d not found: ID does not exist" containerID="0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.277778 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d"} err="failed to get container status \"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d\": rpc error: code = NotFound desc = could not find container \"0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d\": container with ID starting with 0cb24152d0017e61b28e9198768217e352f07cc8e665cc78ac1d86360ebc163d not found: ID does not exist"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.277801 4958 scope.go:117] "RemoveContainer" containerID="3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f"
Mar 20 09:56:41 crc kubenswrapper[4958]: E0320 09:56:41.278146 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f\": container with ID starting with 3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f not found: ID does not exist" containerID="3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.278181 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f"} err="failed to get container status \"3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f\": rpc error: code = NotFound desc = could not find container \"3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f\": container with ID starting with 3137cb4d0808de054df48cbb2a7fa5a58edc15a894c15aa7336574bf07e8ea0f not found: ID does not exist"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.278203 4958 scope.go:117] "RemoveContainer" containerID="25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8"
Mar 20 09:56:41 crc kubenswrapper[4958]: E0320 09:56:41.278694 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8\": container with ID starting with 25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8 not found: ID does not exist" containerID="25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8"
Mar 20 09:56:41 crc kubenswrapper[4958]: I0320 09:56:41.278750 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8"} err="failed to get container status \"25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8\": rpc error: code = NotFound desc = could not find container \"25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8\": container with ID starting with 25f884235789fe055dce224b548e2b211580d8372e303a62c37c5b40f709fec8 not found: ID does not exist"
Mar 20 09:56:42 crc kubenswrapper[4958]: I0320 09:56:42.446207 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" path="/var/lib/kubelet/pods/fd62ee1d-41a1-4276-a2d5-42662160eed4/volumes"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.533742 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:43 crc kubenswrapper[4958]: E0320 09:56:43.534542 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="extract-utilities"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.534588 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="extract-utilities"
Mar 20 09:56:43 crc kubenswrapper[4958]: E0320 09:56:43.534670 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="registry-server"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.534702 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="registry-server"
Mar 20 09:56:43 crc kubenswrapper[4958]: E0320 09:56:43.534725 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="extract-content"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.534740 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="extract-content"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.535108 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd62ee1d-41a1-4276-a2d5-42662160eed4" containerName="registry-server"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.537318 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.550932 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.667672 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.667809 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.668141 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zmfm\" (UniqueName: \"kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.769542 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.769660 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.769772 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zmfm\" (UniqueName: \"kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.770413 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.770432 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.799299 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zmfm\" (UniqueName: \"kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm\") pod \"redhat-marketplace-rqxsr\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") " pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:43 crc kubenswrapper[4958]: I0320 09:56:43.902150 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:44 crc kubenswrapper[4958]: I0320 09:56:44.375867 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:45 crc kubenswrapper[4958]: I0320 09:56:45.215152 4958 generic.go:334] "Generic (PLEG): container finished" podID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerID="3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890" exitCode=0
Mar 20 09:56:45 crc kubenswrapper[4958]: I0320 09:56:45.215201 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerDied","Data":"3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890"}
Mar 20 09:56:45 crc kubenswrapper[4958]: I0320 09:56:45.215229 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerStarted","Data":"20da30c9606eb0fc56518480dc51e1d7d30aaccafb5050403b0a8bf619b6cfd7"}
Mar 20 09:56:45 crc kubenswrapper[4958]: I0320 09:56:45.400789 4958 scope.go:117] "RemoveContainer" containerID="9d55d80bbf3be6029382a4a7196dc90cb90da9def65df4da06e8e8924aa76d8a"
Mar 20 09:56:46 crc kubenswrapper[4958]: I0320 09:56:46.228308 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerStarted","Data":"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"}
Mar 20 09:56:47 crc kubenswrapper[4958]: I0320 09:56:47.242478 4958 generic.go:334] "Generic (PLEG): container finished" podID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerID="3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097" exitCode=0
Mar 20 09:56:47 crc kubenswrapper[4958]: I0320 09:56:47.242680 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerDied","Data":"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"}
Mar 20 09:56:48 crc kubenswrapper[4958]: I0320 09:56:48.254355 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerStarted","Data":"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"}
Mar 20 09:56:53 crc kubenswrapper[4958]: I0320 09:56:53.902578 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:53 crc kubenswrapper[4958]: I0320 09:56:53.903433 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:53 crc kubenswrapper[4958]: I0320 09:56:53.961208 4958 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:54 crc kubenswrapper[4958]: I0320 09:56:54.005104 4958 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rqxsr" podStartSLOduration=8.554728722 podStartE2EDuration="11.005073739s" podCreationTimestamp="2026-03-20 09:56:43 +0000 UTC" firstStartedPulling="2026-03-20 09:56:45.218012133 +0000 UTC m=+3425.540028091" lastFinishedPulling="2026-03-20 09:56:47.66835714 +0000 UTC m=+3427.990373108" observedRunningTime="2026-03-20 09:56:48.284730991 +0000 UTC m=+3428.606746949" watchObservedRunningTime="2026-03-20 09:56:54.005073739 +0000 UTC m=+3434.327089727"
Mar 20 09:56:54 crc kubenswrapper[4958]: I0320 09:56:54.383046 4958 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:54 crc kubenswrapper[4958]: I0320 09:56:54.461957 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:56 crc kubenswrapper[4958]: I0320 09:56:56.330446 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rqxsr" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="registry-server" containerID="cri-o://196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1" gracePeriod=2
Mar 20 09:56:56 crc kubenswrapper[4958]: I0320 09:56:56.521520 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:56:56 crc kubenswrapper[4958]: I0320 09:56:56.521631 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.280523 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.340986 4958 generic.go:334] "Generic (PLEG): container finished" podID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerID="196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1" exitCode=0
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.341050 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerDied","Data":"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"}
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.341120 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rqxsr"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.341184 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rqxsr" event={"ID":"a801e534-5d4a-4108-a681-f1f6645b90d2","Type":"ContainerDied","Data":"20da30c9606eb0fc56518480dc51e1d7d30aaccafb5050403b0a8bf619b6cfd7"}
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.341216 4958 scope.go:117] "RemoveContainer" containerID="196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.360425 4958 scope.go:117] "RemoveContainer" containerID="3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.380740 4958 scope.go:117] "RemoveContainer" containerID="3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.405291 4958 scope.go:117] "RemoveContainer" containerID="196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"
Mar 20 09:56:57 crc kubenswrapper[4958]: E0320 09:56:57.405870 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1\": container with ID starting with 196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1 not found: ID does not exist" containerID="196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.405972 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zmfm\" (UniqueName: \"kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm\") pod \"a801e534-5d4a-4108-a681-f1f6645b90d2\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") "
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.405982 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1"} err="failed to get container status \"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1\": rpc error: code = NotFound desc = could not find container \"196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1\": container with ID starting with 196afac50a3bcbba99e7da1f606fe4782e8abc80f81d527521975c4089bf6ad1 not found: ID does not exist"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.406032 4958 scope.go:117] "RemoveContainer" containerID="3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.406046 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content\") pod \"a801e534-5d4a-4108-a681-f1f6645b90d2\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") "
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.406197 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities\") pod \"a801e534-5d4a-4108-a681-f1f6645b90d2\" (UID: \"a801e534-5d4a-4108-a681-f1f6645b90d2\") "
Mar 20 09:56:57 crc kubenswrapper[4958]: E0320 09:56:57.406988 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097\": container with ID starting with 3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097 not found: ID does not exist" containerID="3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.407073 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097"} err="failed to get container status \"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097\": rpc error: code = NotFound desc = could not find container \"3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097\": container with ID starting with 3d424a538f06b6a8690db431bcefd7dc653ab2f9f3ec8b89792dbef86c01e097 not found: ID does not exist"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.407115 4958 scope.go:117] "RemoveContainer" containerID="3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890"
Mar 20 09:56:57 crc kubenswrapper[4958]: E0320 09:56:57.407692 4958 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890\": container with ID starting with 3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890 not found: ID does not exist" containerID="3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.407756 4958 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890"} err="failed to get container status \"3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890\": rpc error: code = NotFound desc = could not find container \"3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890\": container with ID starting with 3f1089318b1d6df43d1be8f123d7917eada2ef317cdf331c97ce5edd7543e890 not found: ID does not exist"
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.408269 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities" (OuterVolumeSpecName: "utilities") pod "a801e534-5d4a-4108-a681-f1f6645b90d2" (UID: "a801e534-5d4a-4108-a681-f1f6645b90d2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.414407 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm" (OuterVolumeSpecName: "kube-api-access-5zmfm") pod "a801e534-5d4a-4108-a681-f1f6645b90d2" (UID: "a801e534-5d4a-4108-a681-f1f6645b90d2"). InnerVolumeSpecName "kube-api-access-5zmfm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.434012 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a801e534-5d4a-4108-a681-f1f6645b90d2" (UID: "a801e534-5d4a-4108-a681-f1f6645b90d2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.509394 4958 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-utilities\") on node \"crc\" DevicePath \"\""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.509566 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zmfm\" (UniqueName: \"kubernetes.io/projected/a801e534-5d4a-4108-a681-f1f6645b90d2-kube-api-access-5zmfm\") on node \"crc\" DevicePath \"\""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.509662 4958 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a801e534-5d4a-4108-a681-f1f6645b90d2-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.696892 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:57 crc kubenswrapper[4958]: I0320 09:56:57.706837 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rqxsr"]
Mar 20 09:56:58 crc kubenswrapper[4958]: I0320 09:56:58.451444 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" path="/var/lib/kubelet/pods/a801e534-5d4a-4108-a681-f1f6645b90d2/volumes"
Mar 20 09:57:26 crc kubenswrapper[4958]: I0320 09:57:26.522216 4958 patch_prober.go:28] interesting pod/machine-config-daemon-kvsdf container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 20 09:57:26 crc kubenswrapper[4958]: I0320 09:57:26.523353 4958 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 20 09:57:26 crc kubenswrapper[4958]: I0320 09:57:26.523460 4958 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf"
Mar 20 09:57:26 crc kubenswrapper[4958]: I0320 09:57:26.524718 4958 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1e1383ccb143b7dd2d877c7552557c47060151058968b967f4edfee63a4ba3bc"} pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 20 09:57:26 crc kubenswrapper[4958]: I0320 09:57:26.524852 4958 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" podUID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerName="machine-config-daemon" containerID="cri-o://1e1383ccb143b7dd2d877c7552557c47060151058968b967f4edfee63a4ba3bc" gracePeriod=600
Mar 20 09:57:27 crc kubenswrapper[4958]: I0320 09:57:27.651163 4958 generic.go:334] "Generic (PLEG): container finished" podID="d3bb0dff-98a7-4359-841f-5fb469ebc3f4" containerID="1e1383ccb143b7dd2d877c7552557c47060151058968b967f4edfee63a4ba3bc" exitCode=0
Mar 20 09:57:27 crc kubenswrapper[4958]: I0320 09:57:27.651300 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerDied","Data":"1e1383ccb143b7dd2d877c7552557c47060151058968b967f4edfee63a4ba3bc"}
Mar 20 09:57:27 crc kubenswrapper[4958]: I0320 09:57:27.652071 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-kvsdf" event={"ID":"d3bb0dff-98a7-4359-841f-5fb469ebc3f4","Type":"ContainerStarted","Data":"868bb794e6ab76a793fe1efde1f5ea485398d4bed2d3d3d950034af1fa2558ec"}
Mar 20 09:57:27 crc kubenswrapper[4958]: I0320 09:57:27.652127 4958 scope.go:117] "RemoveContainer" containerID="2eed96f0bf21107b0947fdff43a0a024b227aa60e355a27bcd33654f9083402e"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.169817 4958 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566678-gmwnr"]
Mar 20 09:58:00 crc kubenswrapper[4958]: E0320 09:58:00.171097 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="extract-utilities"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.171117 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="extract-utilities"
Mar 20 09:58:00 crc kubenswrapper[4958]: E0320 09:58:00.171147 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="extract-content"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.171155 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="extract-content"
Mar 20 09:58:00 crc kubenswrapper[4958]: E0320 09:58:00.171169 4958 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="registry-server"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.171178 4958 state_mem.go:107] "Deleted CPUSet assignment" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="registry-server"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.171393 4958 memory_manager.go:354] "RemoveStaleState removing state" podUID="a801e534-5d4a-4108-a681-f1f6645b90d2" containerName="registry-server"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.172098 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.175332 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.176323 4958 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.176902 4958 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-t4ttj"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.178307 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-gmwnr"]
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.236838 4958 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zlcn6\" (UniqueName: \"kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6\") pod \"auto-csr-approver-29566678-gmwnr\" (UID: \"cccb6736-9933-42fa-92fa-8774b5c2c7e4\") " pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.337983 4958 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zlcn6\" (UniqueName: \"kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6\") pod \"auto-csr-approver-29566678-gmwnr\" (UID: \"cccb6736-9933-42fa-92fa-8774b5c2c7e4\") " pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.359717 4958 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zlcn6\" (UniqueName: \"kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6\") pod \"auto-csr-approver-29566678-gmwnr\" (UID: \"cccb6736-9933-42fa-92fa-8774b5c2c7e4\") " pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.497392 4958 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.794709 4958 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566678-gmwnr"]
Mar 20 09:58:00 crc kubenswrapper[4958]: I0320 09:58:00.971014 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-gmwnr" event={"ID":"cccb6736-9933-42fa-92fa-8774b5c2c7e4","Type":"ContainerStarted","Data":"0992b363cdfc550859b5f0263678dac730731c54087d93c5aa0f87bfe4316bc8"}
Mar 20 09:58:02 crc kubenswrapper[4958]: I0320 09:58:02.992784 4958 generic.go:334] "Generic (PLEG): container finished" podID="cccb6736-9933-42fa-92fa-8774b5c2c7e4" containerID="babfc5da0bbebe2a3b00b09957677ee0c75e1a24e95c762cfe0ea3bbc38703bf" exitCode=0
Mar 20 09:58:02 crc kubenswrapper[4958]: I0320 09:58:02.992886 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-gmwnr" event={"ID":"cccb6736-9933-42fa-92fa-8774b5c2c7e4","Type":"ContainerDied","Data":"babfc5da0bbebe2a3b00b09957677ee0c75e1a24e95c762cfe0ea3bbc38703bf"}
Mar 20 09:58:04 crc kubenswrapper[4958]: I0320 09:58:04.291928 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:04 crc kubenswrapper[4958]: I0320 09:58:04.412906 4958 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zlcn6\" (UniqueName: \"kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6\") pod \"cccb6736-9933-42fa-92fa-8774b5c2c7e4\" (UID: \"cccb6736-9933-42fa-92fa-8774b5c2c7e4\") "
Mar 20 09:58:04 crc kubenswrapper[4958]: I0320 09:58:04.421329 4958 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6" (OuterVolumeSpecName: "kube-api-access-zlcn6") pod "cccb6736-9933-42fa-92fa-8774b5c2c7e4" (UID: "cccb6736-9933-42fa-92fa-8774b5c2c7e4"). InnerVolumeSpecName "kube-api-access-zlcn6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 20 09:58:04 crc kubenswrapper[4958]: I0320 09:58:04.514669 4958 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zlcn6\" (UniqueName: \"kubernetes.io/projected/cccb6736-9933-42fa-92fa-8774b5c2c7e4-kube-api-access-zlcn6\") on node \"crc\" DevicePath \"\""
Mar 20 09:58:05 crc kubenswrapper[4958]: I0320 09:58:05.016075 4958 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566678-gmwnr" event={"ID":"cccb6736-9933-42fa-92fa-8774b5c2c7e4","Type":"ContainerDied","Data":"0992b363cdfc550859b5f0263678dac730731c54087d93c5aa0f87bfe4316bc8"}
Mar 20 09:58:05 crc kubenswrapper[4958]: I0320 09:58:05.016138 4958 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0992b363cdfc550859b5f0263678dac730731c54087d93c5aa0f87bfe4316bc8"
Mar 20 09:58:05 crc kubenswrapper[4958]: I0320 09:58:05.016213 4958 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566678-gmwnr"
Mar 20 09:58:05 crc kubenswrapper[4958]: I0320 09:58:05.364900 4958 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-zcldt"]
Mar 20 09:58:05 crc kubenswrapper[4958]: I0320 09:58:05.373480 4958 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566672-zcldt"]
Mar 20 09:58:06 crc kubenswrapper[4958]: I0320 09:58:06.447944 4958 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f2cd05c-c17b-44dd-85b7-4e5e183846d0" path="/var/lib/kubelet/pods/7f2cd05c-c17b-44dd-85b7-4e5e183846d0/volumes"
Mar 20 09:58:45 crc kubenswrapper[4958]: I0320 09:58:45.539733 4958 scope.go:117] "RemoveContainer" containerID="b72022e48421b3eafb6795a1a6172a762f4735cf5dd2185e38139c1d706e7436"